GStreamer 1.0 examples for iOS, Android and in general

As the folks at gstreamer.com (not to be confused with the GStreamer project) are still at the old and unmaintained GStreamer 0.10 release series, I started to port all their tutorials and examples to 1.x. You can find the code here: http://cgit.freedesktop.org/~slomo/gst-sdk-tutorials/

This includes the generic tutorials and examples, and ones for iOS and Android. Over the past months many people wanted to try the 1.x binaries for iOS and Android and were asking for examples of how to use them. The fourth and fifth tutorials especially should help people get started fast; you can find them here (Android) and here (iOS).

If there are any problems with these, please report them to me, or if you suspect a GStreamer bug, report it in Bugzilla. The Xcode OS X project files and the Visual Studio project files are ported, but I didn't test them; please report whether they work 🙂

228 thoughts on “GStreamer 1.0 examples for iOS, Android and in general”

  1. Hi,

    first I want to thank you for the 1.0 example. It solved the MP3 playback problem for me.
    Maybe you can help me resolve some other problems I have.

    1.) Logs from the native code via GST_DEBUG() are not visible in the Eclipse logcat window, but __android_log_print() works fine. Creating a dot file does not work either.

    2.) I want to use goom for audio visualization, but if I try something like this: http://docs.gstreamer.com/display/GstSDK/Playback+tutorial+6%3A+Audio+visualization it doesn't find a visualization plugin, and I don't know how to add the needed plugin.

    I hope you can help me
    Many thanks in advance, Thomas

    1. For 1), you can enable logs by calling gst_debug_set_default_threshold() for all categories, or gst_debug_set_threshold_for_name() for a single category.
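
      A minimal sketch of both calls, together with a log handler that forwards GStreamer debug output to logcat (GST_DEBUG() goes to stderr by default, which Android discards; the generated gstreamer_android.c installs a similar handler). The category name "GST_STATES" is just an example:

      #include <gst/gst.h>
      #include <android/log.h>

      /* Forward GStreamer debug output to the Android log */
      static void
      android_log_func (GstDebugCategory * category, GstDebugLevel level,
          const gchar * file, const gchar * function, gint line,
          GObject * object, GstDebugMessage * message, gpointer user_data)
      {
        __android_log_print (ANDROID_LOG_DEBUG, "GStreamer", "%s:%d %s",
            function, line, gst_debug_message_get (message));
      }

      static void
      enable_gst_logging (void)
      {
        gst_debug_set_default_threshold (GST_LEVEL_WARNING);       /* all categories */
        gst_debug_set_threshold_for_name ("GST_STATES", GST_LEVEL_DEBUG);  /* one category */
        gst_debug_remove_log_function (gst_debug_log_default);     /* drop the stderr handler */
        gst_debug_add_log_function (android_log_func, NULL, NULL);
      }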

      For 2), just add goom to your plugin list in Android.mk, or use $(GSTREAMER_PLUGINS_VIS), which contains it.

      1. Hi,
        Thank you for your source code!
        Last year I have tested your code using Eclipse on Ubuntu.
        But now I need to use Android Studio on Windows/Ubuntu to do some work with GStreamer. However, I failed to set up the development environment on both Windows and Ubuntu. Would you be so kind as to help me solve some problems? Thank you very much.
        On Windows 7, I use Android Studio 2.3 and NDK r9d. The test code is your tutorial 5 and the log is:
        —————————
        D:/android-ndk/android-ndk-r9d/toolchains/arm-linux-androideabi-4.6/prebuilt/windows-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld.gold.exe: gst-build-armeabi/gstreamer_android.o: in function gst_android_register_static_plugins:gstreamer_android.c(.text.gst_android_register_static_plugins+0x44): error: undefined reference to 'gst_plugin_autodetect_register'
        ——————————-

  2. Hi, I use Android tutorial 5 to play HD video (1080p, framerate 24, bitrate 3000 kbps), but playback is bad; it seems abnormal, not smooth…
    720p is OK, though.
    How can I improve it? Thanks.

      1. Hi, thanks for your help.
        I am new to GStreamer and have some questions:
        1. I found that you use playbin instead of playbin2 in Android tutorial 5 with GStreamer 1.0; why is playbin2 not used?
        2. At http://gstreamer.freedesktop.org/data/pkg/android/, the latest Android release is 1.2.4, but the current release is 1.3.2; can I not use your latest androidmedia plugin with 1.2.4? (Also, the eglglessink element was removed and replaced by the glimagesink element.)
        3. Can I set playbin's videosink to glimagesink with 1.2.4?
        4. I found that the androidmedia plugin added some new things: gstamcsurface, gstamcsurfacetexture, and so on.
        Is there some example I can study to learn how to use them?

      2. 1) There is no playbin2 in GStreamer 1.x. What was playbin2 in 0.10 is now playbin
        2) androidmedia was available in 1.2.4 too. But if you want the latest changes, and also to apply that patch there, you have to build GStreamer 1.3 yourself (I had no time to create binaries yet). You can use cerbero for that.
        3) Yes, it works the same way as eglglessink from the application point of view
        4) You don’t 🙂 A pipeline with the androidmedia decoders and glimagesink (with all those patches from the bug) will make use of them automatically
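
        To make the rename in 1) concrete, a minimal sketch (the URI is just a placeholder):

        /* "playbin" in 1.x is what "playbin2" was in 0.10 */
        GstElement *playbin = gst_element_factory_make ("playbin", NULL);

        g_object_set (playbin, "uri", "http://example.com/video.webm", NULL);
        gst_element_set_state (playbin, GST_STATE_PLAYING);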

  3. Hello Slomo,
    I installed the 1.4.0 package and updated the tutorials from your site.
    It compiled well. However, I noticed that some videos play fine while others only have sound. Even videos recorded on an iDevice have only audio, not video. I see tutorial 5 uses the 'GST_VIDEO_OVERLAY' overlay; is that the problem? The older version had 'GST_X_OVERLAY', which worked well but is not supported anymore.
    How can I fix it, or choose the right version to use (I get some errors when using the older version now)?
    Thanks in advance :).

    1. What’s the container format and the used codecs in those files that only have sound? Did they work before? And which codecs did you enable in gst_ios_main.h, only the default ones that are in the GIT repository?
      And do you access them as local files or via some streaming protocol? Which?

      1. Hello Slomo,
        Sorry for the late reply. The files I have problems with have the MOV extension and come from videos recorded by an iPad (maybe M4V format). I tested with both local files and streaming over the internet, with the same problem. I just appended some direct links to tutorial 5, and I don't know exactly which codecs you mean in gst_ios_main.h (I just use tutorial 5 with the default codecs and play with 'playbin', I think).
        Example link only have sound: http://www.wowza.com/_h264/BigBuckBunny_115k.mov
        Looking forward to hearing from you.
        Thanks in advance.

  4. Hi,
    I'm new to GStreamer but I'd like to use it for an Android project. I've downloaded the GStreamer Android SDK (including GStreamer version 0.10) from gstreamer.com, the GStreamer 1.0 .zip for Android from gstreamer.freedesktop.org, and your tutorials.

    I've managed to run the 0.10 tutorials but I don't quite understand what changes I need to apply to run your tutorials and use GStreamer 1.0 on Android. Do I need to replace some folders in the GStreamer SDK, or not use the SDK at all?
    Where should GSTREAMER_ROOT_ANDROID point to? The GStreamer SDK folder or the unzipped GStreamer 1.0 folder? Do I need to create new plugins.mk files or use the one provided within the SDK?

    Sorry for the dumb questions. Thanks in advance for your help.
    Kind regards, Joey

    1. GSTREAMER_ROOT should point to the unzipped GStreamer directory, the one that contains the lib and share directories. You don’t need to create the plugins.mk yourself but just use the one that comes with the binaries.

      Basically you only need a directory structure like that of the tutorials: jni/*.c, jni/Android.mk, AndroidManifest.xml, src/**/*.java

      1. Thank you for your reply.
        Unfortunately there’s no plugins.mk within the binaries…let me recap what I’ve done so far:

        1. I’ve downloaded the SDK from gstreamer.com and unzipped it to, say, gstreamer-sdk. Inside this folder there is an include, lib and share folder and somewhere inside the share folder there’s a plugins.mk. But e.g. lib contains a sub-folder gstreamer0.10 and I want to use 1.0

        2. I've downloaded gstreamer-1.0-android-arm-1.4.0-debug-runtime.zip from http://gstreamer.freedesktop.org/data/pkg/android/1.4.0/ and unzipped it to gstreamer1.0. This folder contains bin, etc, lib, libexec, share, but none of these folders contains a plugins.mk. What do I need to do with this folder or zip?

        If I let GSTREAMER_ROOT point to the gstreamer-sdk folder, plugins.mk etc. are found, but gstreamer-1.0.mk (line 23 of the tutorial 5 Android.mk) is not.
        Additionally, the import org.freedesktop.gstreamer.GStreamer in Tutorial5.java cannot be resolved.
        It feels like I’m missing an important step. Thanks for your help. Greatly appreciated.

      2. 1) Don’t use gstreamer.com, that’s obsolete and not connected in any way to the GStreamer project
        2) You don’t want the “runtime” variant of the binaries but the one without. The runtime bits only contain executables you could use on rooted phones. You usually never want the “runtime” variant 🙂

      3. Oh well, that explains a lot 🙂 thank you.
        My last question would be: can I use avdec_h264 on Android? I get an error message saying there's no such element. It's not in plugins.mk either…

        Kind regards.

  5. Hi,

    is it possible to play audio through a USB device on Android? The USB device is recognized as an audio input on Windows, and it is recognized as a USB sound card by the "UsbAudioTester" app on Google Play.

    Thanks
    Thomas

    1. Can it be used by the standard Android apps? If not then it will probably be quite some work to get that running, otherwise it should work already.

      1. What standard Android apps do you mean? If you tell me one I will try it. The "UsbAudioTester" app from the Google Play Store recognizes it.

        Can you please give me an example of how to use USB devices as a source with GStreamer?
        Can I use GStreamer 1.4 or do I need 0.10? Do I need extra plugins, or do I only need http://gstreamer.freedesktop.org/data/pkg/android/1.4.0/gstreamer-1.0-android-arm-1.4.0-debug.zip?

      2. I meant standard apps like the music player, or anything producing sound really. If none of these can use your USB audio, it will probably be quite some work. Best to check the Internet if someone did such a thing already then, or how the USB device is made available to the apps, if at all.

      3. I don't have standard apps where I can select another audio input, but I found that there is a folder called "/proc/asound" in which there is a file called "device". I think this means that ALSA is supported. The "device" file includes my USB audio input device when it is connected, and GStreamer also supports alsasrc. I hope this is the solution to get my device working.
        Is alsasrc also supported on Android, or are there limitations depending on the Android version?
        Can anybody tell me how to use alsasrc?

        Thanks

      4. ALSA is not public API on Android, so it might work or not depending on the device you have. That’s also why we don’t include the alsa plugin in the binary releases. You’ll have to compile it yourself, but in general it works well on Android devices that allow you to use ALSA.

      5. Hmm. That looks a little bit too difficult for me :/
        I think I need some help to do that.

      6. I have now created a VMware VM with Ubuntu 12.04 and tried to compile the GStreamer SDK with cerbero, but I always get the same error.

        If I try to use "python cerbero-uninstalled -c config/cross-android.cbc bootstrap" or "python cerbero-uninstalled bootstrap" I always get the following error:

        checking for Minix Amsterdam compiler… no
        configure: error: cannot run /bin/bash build-aux/config.sub
        Running command './configure --prefix /home/notroot/cerbero/build-tools --libdir /home/notroot/cerbero/build-tools/lib --disable-maintainer-mode --disable-silent-rules --enable-introspection '
        Recipe 'm4' failed at the build step 'configure'

        If I try to use "python cerbero-uninstalled -c config/cross-android.cbc buildone gstreamer-1.0-static" I get the following error:

        + skipping configure stage for package gstreamer, as requested.
        + autogen.sh done.
        configure: WARNING: if you wanted to set the --build type, don't use --host.
        If a cross compiler is detected then cross compile mode will be used
        configure: error: cannot run /bin/bash ./config.sub
        Running command 'sh ./autogen.sh --noconfigure && ./configure --prefix /home/notroot/cerbero/dist/android_arm --libdir /home/notroot/cerbero/dist/android_arm/lib --enable-introspection=no --disable-examples --enable-static-plugins --disable-shared --enable-static --disable-gtk-doc --disable-docbook --disable-gtk-doc --enable-static --disable-maintainer-mode --disable-silent-rules --disable-introspection --host=arm-linux-androideabi'
        Recipe 'gstreamer-1.0-static' failed at the build step 'configure'

        It seems there is always a problem at the configure step.
        Can you please help me?

  6. Hi,

    thanks for this post and the links too. I am planning to use GStreamer for Android; can I use the demo that you provided in the tutorial 5 link? I want to play RTMP streams with the mp4 extension. Will that work? Could you please guide me, as I am a noob with GStreamer? Thanks

  7. Hello!
    I use tutorial 3 on iOS. It works fine on iPhone 5, but on iPad 2 & 3 it is too slow.
    My pipeline is "udpsrc port=3000 do-timestamp=true typefind=true buffer-size=500000 caps=\"application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96\" ! rtpjitterbuffer name=rtpjitterbuffer latency=250 drop-on-latency=true mode=buffer ! rtph264depay ! avdec_h264 ! autovideosink"
    The video resolution is 720p.
    What do I need to optimize, the app or the pipeline? Or both ;)?

    1. Most likely the CPU is too slow for decoding the video. First step would be to confirm that and profile your application, then optimize the part that is confirmed to be too slow.

      1. Apple announced the Video Toolbox API for hardware video decoding/encoding in iOS 8.
        The headers and a demo app are now available on the Apple developer site.
        Can we use it in a future version of GStreamer?

      2. I finished time profiling on iPhone 5:
        57% – h264_decode_frame
        23% – draw & convert and everything else

        The memory leak test shows some leaks at plugin init when streaming begins:
        97% – orc_compiler_error_valist, deep inside gst_plugin_register_func
        1.4% – ffmpeg_demux_register

        The OpenGL analysis test shows some warnings like this:
        1 v2r-player gst_gl_context_eagl_create_context /Users/slomo/cerbero/sources/ios_universal/armv7/gst-plugins-bad-1.0-1.3/gst-libs/gst/gl/eagl/gstglcontext_eagl.m:166
        "This command was redundant:

        glBindFramebuffer(GL_FRAMEBUFFER, 1u)
        A GL function call that sets a piece of GL state to its current value has been detected. Minimize the number of these redundant state calls, since these calls are performing unnecessary work."

      1. I have Xcode6-beta.app installed, but cerbero shows me an error:

        "Mac-mini-Aleksandr:cerbero sovest$ ./cerbero-uninstalled bootstrap
        Traceback (most recent call last):
        File "./cerbero/utils/__init__.py", line 258, in parse_file
        execfile(filename, dict)
        File "/Users/sovest/cerbero/config/darwin.config", line 56, in <module>
        raise Exception("OSX SDK not found, please install XCode")
        Exception: OSX SDK not found, please install XCode
        Configuration Error: Could not include config file (/Users/sovest/cerbero/config/darwin.config)"

      2. I did not port the OS X parts to Xcode 6 yet; I will do that today. After that this should run. Then you only need to change the minimum version in config/cross-ios-armv7.cbc to 8.0 and use that config (universal and x86 builds don't work with Xcode 6 yet; Apple is good at breaking stuff).

      3. ./cerbero-uninstalled -c ./config/cross-ios-arm7.cbc build gstreamer-1.0 – the build is done!
        But how do I build the framework for iOS 8?

      4. Set the minimum version to 8.0 in config/cross-ios-armv7.cbc. As you might notice, this is all still work in progress. Unless you plan to help with that, it's probably better to just wait a few more days 🙂

      5. ./cerbero-uninstalled -c ./config/cross-ios-arm7.cbc build gstreamer-1.0 – does this command build the framework, or not?

      6. Use "package" instead of "build", but the resulting framework has a little bug that currently has to be fixed after installation.

      7. Mac-mini-Aleksandr:cerbero sovest$ ./cerbero-uninstalled -c ./config/cross-ios-arm7.cbc package gstreamer-1.0
        Building the following recipes: gettext libiconv libffi zlib glib gtk-doc-lite gstreamer-1.0 libxml2 libogg libpng pixman expat bzip2 freetype fontconfig cairo harfbuzz pango libvorbis libtheora libvisual orc tremor gst-plugins-base-1.0 gst-shell libjpeg-turbo speex tiff gdk-pixbuf gmp nettle libtasn1 gnutls glib-networking libsoup wavpack flac taglib libvpx libdv gst-plugins-good-1.0 fribidi libass faad2 libkate opus libgpg-error libgcrypt librtmp schroedinger libdca libmms soundtouch vo-aacenc libcroco librsvg openjpeg gst-plugins-bad-1.0 a52dec opencore-amr libmpeg2 libmad x264 gst-plugins-ugly-1.0 gstreamer-ios-templates gstreamer-1.0-static gst-plugins-base-1.0-static gst-plugins-good-1.0-static gst-plugins-bad-1.0-static gst-plugins-ugly-1.0-static gst-rtsp-server-1.0 glib-networking-static gst-libav-1.0 gst-libav-1.0-static gnonlin-1.0 gst-validate gst-editing-services-1.0 gnonlin-1.0-static
        [(1/78) gettext -> already built ]
        [(2/78) libiconv -> already built ]
        [(3/78) libffi -> already built ]
        [(4/78) zlib -> already built ]
        [(5/78) glib -> already built ]
        [(6/78) gtk-doc-lite -> already built ]
        [(7/78) gstreamer-1.0 -> already built ]
        [(8/78) libxml2 -> already built ]
        [(9/78) libogg -> already built ]
        [(10/78) libpng -> fetch ]
        —–> Step done
        [(10/78) libpng -> extract ]
        —–> Step done
        [(10/78) libpng -> configure ]
        —–> Step done
        [(10/78) libpng -> compile ]
        /Applications/Xcode.app/Contents/Developer/usr/bin/make all-am
        source='arm/filter_neon.S' object='arm/filter_neon.lo' libtool=yes \
        DEPDIR=.deps depmode=none /bin/sh ./depcomp \
        /bin/sh ./libtool --mode=compile gas-preprocessor.pl clang -DHAVE_CONFIG_H -I. -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -miphoneos-version-min=8.0 -Wall -g -Os -Wno-error=format-nonliteral -Wno-unused-command-line-argument -c -o arm/filter_neon.lo arm/filter_neon.S
        libtool: compile: gas-preprocessor.pl clang -DHAVE_CONFIG_H -I. -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -miphoneos-version-min=8.0 -Wall -g -Os -Wno-error=format-nonliteral -Wno-unused-command-line-argument -c arm/filter_neon.S -fno-common -DPIC -o arm/.libs/filter_neon.o
        ./libtool: line 1111: gas-preprocessor.pl: command not found
        make[1]: *** [arm/filter_neon.lo] Error 1
        make: *** [all] Error 2
        Running command 'make'

        Recipe 'libpng' failed at the build step 'compile'
        Select an action to proceed:
        [0] Enter the shell
        [1] Rebuild the recipe from scratch
        [2] Rebuild starting from the failed step
        [3] Skip recipe
        [4] Abort

        If I choose the skip action, many other steps end with errors.

      8. Oh! I needed to run bootstrap with the config!

        Mac-mini-Aleksandr:cerbero sovest$ ./cerbero-uninstalled -c ./config/cross-ios-arm7.cbc bootstrap
        WARNING: No bootstrapper for the distro version ios_8_0
        WARNING: No bootstrapper for the distro version osx_mavericks

        done!

        But building the package fails at step 22.

        [(22/78) orc -> fetch ]
        —–> Step done
        [(22/78) orc -> extract ]
        —–> Step done
        [(22/78) orc -> configure ]
        —–> Step done
        [(22/78) orc -> compile ]
        /Applications/Xcode.app/Contents/Developer/usr/bin/make all-recursive
        Making all in orc
        make[2]: Nothing to be done for `all'.
        Making all in orc-test
        /bin/sh ../libtool --tag=CC --mode=compile clang -DHAVE_CONFIG_H -I. -I.. -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -Wall -Werror -I.. -D_GNU_SOURCE -DORC_ENABLE_UNSTABLE_API -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -miphoneos-version-min=8.0 -Wall -g -Os -Wno-error=format-nonliteral -MT liborc_test_0.4_la-orctest.lo -MD -MP -MF .deps/liborc_test_0.4_la-orctest.Tpo -c -o liborc_test_0.4_la-orctest.lo `test -f 'orctest.c' || echo './'`orctest.c
        libtool: compile: clang -DHAVE_CONFIG_H -I. -I.. -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -Wall -Werror -I.. -D_GNU_SOURCE -DORC_ENABLE_UNSTABLE_API -arch armv7 -mcpu=cortex-a8 -pipe -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk -miphoneos-version-min=8.0 -Wall -g -Os -Wno-error=format-nonliteral -MT liborc_test_0.4_la-orctest.lo -MD -MP -MF .deps/liborc_test_0.4_la-orctest.Tpo -c orctest.c -fno-common -DPIC -o .libs/liborc_test_0.4_la-orctest.o
        orctest.c:134:9: error: 'system' is deprecated: first deprecated in iOS 8.0 - Use posix_spawn APIs instead. [-Werror,-Wdeprecated-declarations]
        ret = system (cmd);
        ^
        /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.0.sdk/usr/include/stdlib.h:177:6: note: 'system' has been explicitly marked deprecated here
        int system(const char *) __DARWIN_ALIAS_C(system) __OSX_AVAILABLE_BUT_DEPRECATED_MSG(__MAC_10_0,__MAC_NA,__IPHONE_2_0,__IPHONE_8_0, "Use posix_spawn APIs instead.");

  8. I can't build the app for deployment target 8.0; only 6.1 works.
    When I choose any other deployment target for the app build, I receive an error message.

      1. Oh! A new app from the template builds successfully.
        Only one small fix is needed:
        a #import line has to be added at the top of gst_ios_init.m

      2. GStreamer.framework 1.4.1 & the latest tutorial from git

        The error is:
        ld: warning: could not create compact unwind for .LFB3: non-standard register 5 being saved in prolog
        Undefined symbols for architecture i386:
        "std::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::find(wchar_t const*, unsigned long, unsigned long) const", referenced from:
        TagLib::String::find(TagLib::String const&, int) const in GStreamer(libtag_a_i386_-tstring.cpp.o)
        "std::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::rfind(wchar_t const*, unsigned long, unsigned long) const", referenced from:
        TagLib::String::rfind(TagLib::String const&, int) const in GStreamer(libtag_a_i386_-tstring.cpp.o)
        "std::basic_string<wchar_t, std::char_traits<wchar_t>, std::allocator<wchar_t> >::_M_leak_hard()", referenced from:
        TagLib::String::String(std::string const&, TagLib::String::Type) in GStreamer(libtag_a_i386_-tstring.cpp.o)
        TagLib::String::String(std::string const&, TagLib::String::Type) in GStreamer(libtag_a_i386_-tstring.cpp.o)
        TagLib::String::prepare(TagLib::String::Type) in GStreamer(libtag_a_i386_-tstring.cpp.o)
        TagLib::String::String(char const*, TagLib::String::Type) in GStreamer(libtag_a_i386_-tstring.cpp.o)
        TagLib::String::String(TagLib::ByteVector const&, TagLib::String::Type) in GStreamer(libtag_a_i386_-tstring.cpp.o)
        TagLib::String::to8Bit(bool) const in GStreamer(libtag_a_i386_-tstring.cpp.o)
        TagLib::String::begin() in GStreamer(libtag_a_i386_-tstring.cpp.o)

      3. That's a known problem with the C++ libraries on iOS. We build GStreamer with the ones from iOS < 7.0, unfortunately. You either have to set < 7.0 as the deployment target, or disable the taglib plugin.

      4. It seems I'm starting to understand how this happened.
        I worked on the app in Xcode 5, and when I opened it in Xcode 6 I got these problems.
        If I create a new app directly in Xcode 6, the problems aren't present ;(
        I will think about how to migrate from version 5 to 6.

  9. Hey Sebastian, I compiled GStreamer 1.0 for Android using cerbero, and now I am trying to compile your Android tutorial. I was getting the error that GST_TYPE_VIDEO_OVERLAY was not declared, so I changed the function from gst_bin_get_by_interface to gst_bin_get_by_name, providing the videosink's name. Now I am getting another error: jni/tutorial-3.c:282: error: undefined reference to 'GST_VIDEO_OVERLAY'. I don't know why this error occurs. Would you please guide me?

    regards
    Moonzai

      1. Thanks Sebastian, I made it work; I just included gst/video/videooverlay.h and it compiled 🙂

        regards
        Moonzai

      2. Hey Sebastian,
        I need your help: can you tell me how to compile gst-omx for Android? I need hardware acceleration for H.264 on Android. I searched the internet and found that gst-omx is the solution to my problem.

        regards
        Moonzai

      3. It’s most likely not the solution you’re looking for. OpenMAX IL is not a public API on Android, and you’ll need a custom configuration of gst-omx for every single Android device. Look at the androidmedia plugin, which uses the public android.media.MediaCodec API.

        Independent of that, you can compile gst-omx for Android just like any other software… and then have to create a config file like the ones in the config directory of gst-omx for your specific device.

  10. Thanks Sebastian for your replies. I appreciate it 🙂

    I didn't find any example of how to use the androidmedia plugin; if you could give me an example, that would be very instructive for me.

    regards
    Moonzai

      1. Hey Sebastian,

        I checked with decodebin; here is my pipeline:
        udpsrc port=1234 caps=\"application/x-rtp, media=(string)video\" ! rtph264depay ! h264parse ! decodebin ! videoconvert ! eglglessink name=videosink

        same result 🙁

  11. Hi,

    I'm trying to switch the selected audio stream with g_object_set (player->pipeline, "current-audio", streamID, NULL); but playback mutes after some time instead of switching. If I switch back to stream 0, it starts playing again after some time.

    Is this a known bug, or did I do something wrong?

    regards
    Thomas

      1. Alright, sounds like a bug 🙂 Can you report that at http://bugzilla.gnome.org ? That’s better for tracking this kind of stuff than the comments here. There are some known suboptimal things about switching audio/subtitle streams, but it should generally work.
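
        For reference, a minimal sketch of how stream switching with playbin is typically done, using playbin's documented n-audio and current-audio properties (the function name is just illustrative):

        static void
        switch_to_next_audio_stream (GstElement * playbin)
        {
          gint n_audio = 0, current = 0;

          /* how many audio streams playbin found, and which one is active */
          g_object_get (playbin, "n-audio", &n_audio,
              "current-audio", &current, NULL);
          if (n_audio > 1)
            g_object_set (playbin, "current-audio",
                (current + 1) % n_audio, NULL);
        }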

      2. I have now tried replacing the audio decoder, and now it works. The buggy decoder was part of the androidmedia plugin. I also had problems with other androidmedia decoders.
        Is it recommended to avoid the androidmedia decoders?

      3. No, they're supposed to work 🙂 and without them you'll have to include software codecs instead of using the ones that are already on the device. Which device are you using?

      4. I'm using a Galaxy S3.
        For instance, I have a file with two AAC audio streams and one MPEG-4 video stream. By default GStreamer uses the "amcauddec-omxsecaacdec" decoder, which throws an error. If I set the rank of "amcauddec-omxsecaacdec" to NONE, it uses the GstFaad decoder and everything works. Disabling "amcauddec-omxsecaacdec" also solves the audio stream switching problem.
        I also had problems with "amcauddec-omxsecamrdec" because GStreamer tried to use it for an AAC audio file, so I disabled that one too.

      5. That clearly should be fixed then 🙂 Can you also provide a debug log with gst_debug_set_default_threshold(GST_LEVEL_DEBUG) in the bug report?

      6. Yes, I will provide a debug log of all the buggy features and the media files, but first I have to complete my app.
        I have now finished my playback pipeline, but I also want to record the audio and video that is being played. For this reason I add an appsink parallel to the audio and video output sinks. Then I create a second pipeline with two appsrcs, an audio and a video encoder, a muxer and a filesink. Recording audio works very well, but when I try to record video and audio, or video alone, it often crashes, and the output file has a wrong duration. So if I record for 10 seconds and it does not crash, the file shows about 2 minutes of duration but stops playing after 5 seconds.
        Every time the playback pipeline is running and the appsink gives me data, I send it to the recorder by calling my "transmit" function.
        I have uploaded my recorder here: http://www.filedropper.com/gstreamerrecorder
        Can you tell me if there is something wrong?

      7. I don’t have time to look at this myself currently, please ask on the GStreamer mailing list or if you think it is a bug in GStreamer please report a bug in Bugzilla with a testcase to reproduce it.

  12. First of all, thanks for your code! 🙂
    I managed to run Android tutorial 5 in Android Studio with Gradle, and it works fine as long as I use the stream from the tutorial. If I change it to my RTSP stream from a camera (MPEG-4 video), it doesn't play and the state is always PAUSED (as shown by the UI). Here is a log:

    0:18:09.449432373 0x77433200 src/main/jni/tutorial-5.c:104: set_ui_message Setting message to: State changed to PAUSED
    0:18:09.450225830 0x77266100 src/main/jni/tutorial-5.c:184:execute_seek Seeking to 0:00:00.000000000
    0:18:09.451049805 0x77266100 src/main/jni/tutorial-5.c:457:gst_native_play Setting state to PLAYING
    0:18:09.458435059 0x77433200 src/main/jni/tutorial-5.c:138:refresh_ui Could not query current duration (normal for still pictures)

    and this last log line repeats many times per second.
    Do you have any suggestions on how to deal with it?

    The only changes I've made are that I changed the URI of the video stream and added $(GSTREAMER_PLUGINS_CODECS_RESTRICTED) to PLUGINS in the Android.mk file.

    1. The code currently assumes non-live sources. You probably have to remove the buffering logic for that to work… but I would need to look closer too, maybe there’s another problem too.

      Does that stream work on a normal machine with GStreamer?
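
      For illustration, a minimal sketch of the kind of change meant here, under the assumption of a bus handler like the one in the tutorial (all names are illustrative): detect a live source when going to PAUSED, then skip the buffering logic for it.

      static gboolean is_live;  /* set when starting the pipeline */

      static void
      start_pipeline (GstElement * pipeline)
      {
        GstStateChangeReturn ret =
            gst_element_set_state (pipeline, GST_STATE_PAUSED);

        /* live sources (RTSP cameras etc.) cannot preroll */
        is_live = (ret == GST_STATE_CHANGE_NO_PREROLL);
      }

      static void
      on_buffering_message (GstElement * pipeline, GstMessage * msg)
      {
        gint percent;

        if (is_live)
          return;                 /* don't pause live streams for buffering */

        gst_message_parse_buffering (msg, &percent);
        gst_element_set_state (pipeline,
            percent < 100 ? GST_STATE_PAUSED : GST_STATE_PLAYING);
      }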

    2. Hi Jaroslav,

      Can you please share how you got tutorial 5 compiling successfully with Android Studio (I guess an ndk-build by hand)?

      Thanks!

      1. Thanks Jaroslav, that looks good! 🙂 Note that GStreamer.java is autogenerated by ndk-build though; having it inside Git is not ideal, as it might change when you update to a newer GStreamer version or based on the settings in Android.mk.

      2. Which version of the NDK are you using? This looks like for some reason the libraries are not passed to the linker correctly.

      3. Thanks Jaroslav. I got it running with a few modifications to make it work under Windows (in app\build.gradle: commandLine "$ndkDir/ndk-build.cmd", plus some trial and error with '\' or '\\' or '/'). Unfortunately my hope for easy v4l2 playback was quickly shattered. I will post again if I make some progress, but I doubt it with my current Android development knowledge.

      4. Unfortunately you can't use v4l2 from a normal Android app. There is a usable GStreamer source for Android though; I will send you a link to the code later.

        This code will hopefully be merged into GStreamer soon.

  13. I'm a newbie with streaming media, so I have only checked it in VLC media player (on Mac) and it works. It also shows that there are two audio and two video streams, but as the documentation says, playbin2 should automatically detect which one to play.

  14. I have downloaded, built and executed GStreamer tutorial 5 on a BQ Edison device
    (Android version 4.4.2), but when tutorial 5 runs I observe the following behavior:

    – the buffering % display progressively reaches 100%, then "Buffering complete", but the default video is not displayed
    – if I move the seek control bar, it shows 00:00:00/00:00:00, as if nothing had really been buffered
    – clicking the play button has no effect (no video is shown)

    Do you have any idea what the problem could be?
    Thanks in advance for your help

  15. Thanks for your great work, but I found that there are still some small amendments that need to be made to get those Android tutorials working. I wrote them down in my blog. I hope it will help others.

  16. I'm trying to make Android tutorial 3 work, and I run into 2 problems:

    1. I'm using ADT. Though there is no error when I build the project, I found that the Eclipse indexer cannot resolve the symbol "guintptr". I searched all the header files in the NDK and GStreamer, but I can't find where it is defined. It is supposed to be inside glib.h.

    2. I get the error "W/GStreamer+glimagesink(1310): 0:00:02.398190717 0xb7cf1d80 gstglimagesink.c:458:_ensure_gl_setup: error: glGetString error: 0x500" as I start the app (yes, I replaced autovideosink with glimagesink, but autovideosink gives the same error anyway). How can I overcome this?

    Thanks a lot.

    1. 1) You have to tell your IDE where to find the GStreamer headers. I never used Eclipse so I don't know where to do that, but it probably has somewhere you can configure header search paths. That should point to the GStreamer include directory.

      2) Please report a bug about that at https://bugzilla.gnome.org against GStreamer, with details about which GStreamer version you use and a full debug log with gl*:6 🙂

  17. Hi,
    I have some problems using GStreamer on Android.
    I am now using GStreamer on Android, and my development steps are as follows:
    1. Build the GStreamer SDK for Android using cerbero; gstreamer-1.0-android-arm-1.5.0.zip is successfully created after executing the command "cerbero -c cross-android.cbc package gstreamer-1.0".
    2. Use ndk-build to build the GStreamer SDK into libgstreamer_android.so. The Android.mk is:
    GSTREAMER_SDK_ROOT := /home/qys/cerbero/gstreamer-1.0-android-arm-1.5.0
    GSTREAMER_ROOT := /home/qys/cerbero/gstreamer-1.0-android-arm-1.5.0
    GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_SDK_ROOT)/share/gst-android/ndk-build
    include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
    GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS) $(GSTREAMER_PLUGINS_CODECS_RESTRICTED) audiotestsrc mad
    GSTREAMER_PLUGINS := coreelements audiotestsrc mad opensles
    G_IO_MODULES := gnutls
    GSTREAMER_EXTRA_DEPS := gstreamer-1.0
    include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer-1.0.mk
    3. I wrote a simple application in C to play an MP3 file, and used ndk-build to build an executable file "gst-mp3-test"; the build was OK.
    4. After I pushed the files "gst-mp3-test" and "libgstreamer_android.so" to the directory "/system/lib" and executed "./gst-mp3-test", I could not create the source/decoder/sink elements, such as:
    source = gst_element_factory_make("audiotestsrc","file-source");
    decoder = gst_element_factory_make("mad","mad-decoder");
    sink = gst_element_factory_make("openslessink","audio-output");
    When I built libgstreamer_android.so I added GSTREAMER_PLUGINS in the Android.mk, so why can I not use those plugins and elements?
    What should I do if I want to use the plugins and elements?

    Thank you very much for your help.

    1. libgstreamer_android.so contains GStreamer, all its dependencies and the static plugins. You have to initialize it with the generated GStreamer.java; it's not supposed to be used the way you do (but you can make that work too if you want; look at what GStreamer.java and the related JNI bits are doing). Take a look at the examples/tutorial code here to see how things are supposed to be used: http://cgit.freedesktop.org/~slomo/gst-sdk-tutorials/tree/gst-sdk/tutorials/android-tutorial-1

      In general you can’t just put command line applications on an Android phone and use them, unless your phone is rooted. That’s why the default way of doing things with GStreamer on Android is to write a proper Android application.

      1. Hi Slomo,
        Thank you very much for your help.
        I still have 2 problems:
        1. The examples/tutorial code is a guide to developing an application. But I want to develop player middleware to take the place of stagefright or nuplayer in Android. I have to create a pipeline with whatever elements or plugins I need to add.
        What are the steps? Are they as follows:
        a) GStreamer.init(); (this is the Java interface; what should I use from C/C++?)
        b) pipeline = gst_parse_launch("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
        2. I saw the plugins.mk below, which is in the directory "share/gst-android/ndk-build", but could not find any sink elements. Which sink should I use?
        GSTREAMER_PLUGINS_CORE := coreelements adder app audioconvert audiorate audioresample audiotestsrc gio pango typefindfunctions videoconvert videorate videoscale videotestsrc volume autodetect videofilter
        GSTREAMER_PLUGINS_CAPTURE := camerabin
        GSTREAMER_PLUGINS_CODECS_RESTRICTED := asfmux dtsdec faad mpegpsdemux mpegpsmux mpegtsdemux mpegtsmux voaacenc a52dec amrnb amrwbdec asf dvdsub dvdlpcmdec mad mpeg2dec xingmux realmedia x264 lame libav
        GSTREAMER_PLUGINS_ENCODING := encoding
        GSTREAMER_PLUGINS_CODECS_GPL := assrender
        GSTREAMER_PLUGINS_NET_RESTRICTED := mms rtmp
        GSTREAMER_PLUGINS_SYS := opensles opengl
        GSTREAMER_PLUGINS_VIS := libvisual goom goom2k1 audiovisualizers
        GSTREAMER_PLUGINS_PLAYBACK := playback
        GSTREAMER_PLUGINS_EFFECTS := alpha alphacolor audiofx cairo cutter debug deinterlace dtmf effectv equalizer gdkpixbuf imagefreeze interleave level multifile replaygain shapewipe smpte spectrum videobox videocrop videomixer accurip aiff audiofxbad autoconvert bayer coloreffects debugutilsbad fieldanalysis freeverb frei0r gaudieffects geometrictransform inter interlace ivtc liveadder rawparse removesilence segmentclip smooth speed videofiltersbad audiomixer compositor
        GSTREAMER_PLUGINS_CODECS := subparse ogg theora vorbis ivorbisdec alaw apetag audioparsers auparse avi dv flac flv flxdec icydemux id3demux isomp4 jpeg matroska mulaw multipart png speex taglib vpx wavenc wavpack wavparse y4menc adpcmdec adpcmenc dashdemux dvbsuboverlay dvdspu fragmented id3tag kate midi mxf openh264 opus pcapparse pnm rfbsrc schro gstsiren smoothstreaming subenc videoparsersbad y4mdec jpegformat gdp rsvg openjpeg androidmedia
        GSTREAMER_PLUGINS_NET := tcp rtsp rtp rtpmanager soup udp dataurisrc sdp srtp

        Thanks a lot.

    1. Those lists are plugins, not elements. “playback” is the plugin that contains playbin and decodebin, for Android the sinks are in “opensles” (audio) and “opengl” (video).
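
      As a sketch, a playbin with the two Android sinks set explicitly would look like this (playbin normally picks suitable sinks by rank, so this is optional):

      GstElement *playbin = gst_element_factory_make ("playbin", NULL);
      GstElement *vsink = gst_element_factory_make ("glimagesink", NULL);   /* from the "opengl" plugin */
      GstElement *asink = gst_element_factory_make ("openslessink", NULL);  /* from the "opensles" plugin */

      g_object_set (playbin, "video-sink", vsink, "audio-sink", asink, NULL);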

      1. Hi,
        I want to create a pipeline using playbin (like tutorial 5):
        data->pipeline = gst_parse_launch("playbin", &error);
        Why can it not find the playbin element?
        Is there any way to link plugins dynamically?
        Thanks a lot.

      2. Does it work for you if you just build tutorial-5? Are you statically linking the playback plugin (or $GSTREAMER_PLUGINS_PLAYBACK) into your application?

        It’s possible to dynamically link plugins, just like on any other platform (except iOS). You just have to tell the registry to load the .so, e.g. with gst_plugin_load_file() and gst_registry_add_plugin() or gst_registry_scan_path().
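
        A minimal sketch of that dynamic-loading route (the paths are hypothetical; on Android they would live inside your application directory):

        GError *err = NULL;
        GstPlugin *plugin =
            gst_plugin_load_file ("/data/data/com.example.app/lib/libgstplayback.so", &err);

        if (plugin)
          gst_registry_add_plugin (gst_registry_get (), plugin);
        else
          g_clear_error (&err);

        /* or scan a whole directory instead: */
        gst_registry_scan_path (gst_registry_get (), "/data/data/com.example.app/lib");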

  18. Hi Slomo,
    1. I built and used tutorial 5; it does not work, and I get the error:
    Unable to build pipeline: no element "playbin".
    In my app, I used $GSTREAMER_PLUGINS_PLAYBACK to add plugins to libgstreamer_android.so.
    2. If I link plugins dynamically, do I still need to build libgstreamer_android.so? Can I just use the .so files in the directory "/cerbero/dist/android_arm/lib"?
    Thanks.

    1. Hi,
      I think I found the cause of the problem.
      I call the interface GStreamer.init() -> nativeInit(context) -> gst_native_init().
      nativeInit() in GStreamer.java has a context parameter, but the JNI method gst_native_init() doesn't:
      static JNINativeMethod native_methods[] = {
      { "nativeInit", "()V", (void *) gst_native_init},
      };
      So it gives this error:
      01-01 16:00:08.420 D/AndroidRuntime( 3282): Shutting down VM
      01-01 15:57:32.720 E/AndroidRuntime( 3020): FATAL EXCEPTION: main
      01-01 15:57:32.720 E/AndroidRuntime( 3020): java.lang.UnsatisfiedLinkError: Native method not found: com.gstreamer.GStreamer.nativeInit:(Landroid/content/Context;)V
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at com.gstreamer.GStreamer.nativeInit(Native Method)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at com.gstreamer.GStreamer.init(GStreamer.java:16)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at com.gst_sdk_tutorials.tutorial_5.Tutorial5.onCreate(Tutorial5.java:64)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.Activity.performCreate(Activity.java:5133)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2175)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2261)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.ActivityThread.access$600(ActivityThread.java:141)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1256)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.os.Handler.dispatchMessage(Handler.java:99)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.os.Looper.loop(Looper.java:137)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at android.app.ActivityThread.main(ActivityThread.java:5103)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at java.lang.reflect.Method.invokeNative(Native Method)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at java.lang.reflect.Method.invoke(Method.java:525)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:773)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:589)
      01-01 15:57:32.720 E/AndroidRuntime( 3020): at dalvik.system.NativeStart.main(Native Method)

  19. Hi slomo,
    I have another problem that I hope you can kindly explain to me.
    If I link GStreamer plugins dynamically, I can set the environment variables GST_PLUGIN_PATH, GST_PLUGIN_SCANNER and GST_REGISTRY to the .so path.
    And I used ndk-build to link the GStreamer plugins statically and create libgstreamer_android.so.
    If I write the app in Java, I have to use GStreamer.init() to initialize the GStreamer environment.
    But if I only use libgstreamer_android.so and write an executable in C/C++, I also need to set the environment; what paths should GST_PLUGIN_PATH, GST_PLUGIN_SCANNER and GST_REGISTRY point to?
    Could you teach me?
    Thank you very much.

      1. Thank you very much for your quick reply.
        Yes, I saw the file gstreamer_android-1.0.c.in.
        But I still have a problem.
        What are the environment variables "TMP, TEMP, TMPDIR, XDG_RUNTIME_DIR…" used for?
        I think the environment variables "GST_REGISTRY, GST_PLUGIN_PATH, GST_PLUGIN_SCANNER" are very useful. And which path should GST_PLUGIN_PATH and GST_PLUGIN_SCANNER be set to, as we don't have a directory containing the plugin .so files?

      2. See where cache_dir and files_dir come from in the code I linked. You need to set them to paths inside your application directory on the device, where the .so files are actually placed.
        TMP, TEMP, TMPDIR, etc. are used by other parts of GStreamer, GLib and other software, e.g. to have a location where temporary files can be stored.

        But as I told you a few times, you’re not supposed to use GStreamer with the shared libraries on Android and if you do you’re basically on your own and have to replicate all the work (and more) that was done for the static linking approach.

  20. Hi slomo,
    Thank you very much for your help.
    Yes, now we use GStreamer with static libraries.
    I set the environment variables according to what you said yesterday, but when I execute my app I still get the error "no element playbin".
    The code in my app is as follows:
    gchar *cache_dir = "/data/data/user/cache";
    gchar *files_dir = "/data/data/user/files";
    gchar *registry;
    GError *error = NULL;

    g_print ("GStreamer initialization Start!!\n");

    if (gst_is_initialized ()) {
    g_print ("GStreamer already initialized!!\n");
    return;
    }

    if (cache_dir) {
    g_setenv ("TMP", cache_dir, TRUE);
    g_setenv ("TEMP", cache_dir, TRUE);
    g_setenv ("TMPDIR", cache_dir, TRUE);
    g_setenv ("XDG_RUNTIME_DIR", cache_dir, TRUE);
    g_setenv ("XDG_CACHE_HOME", cache_dir, TRUE);
    registry = g_build_filename (cache_dir, "registry.bin", NULL);
    g_setenv ("GST_REGISTRY", registry, TRUE);
    g_free (registry);
    setenv ("GST_REUSE_PLUGIN_SCANNER", "no", TRUE);
    /* TODO: Should probably also set GST_PLUGIN_SCANNER and GST_PLUGIN_SYSTEM_PATH */
    }

    if (files_dir) {
    gchar *fontconfig, *certs;

    g_setenv ("HOME", files_dir, TRUE);
    g_setenv ("XDG_DATA_DIRS", files_dir, TRUE);
    g_setenv ("XDG_CONFIG_DIRS", files_dir, TRUE);
    g_setenv ("XDG_CONFIG_HOME", files_dir, TRUE);
    g_setenv ("XDG_DATA_HOME", files_dir, TRUE);
    fontconfig = g_build_filename (files_dir, "fontconfig", NULL);
    g_setenv ("FONTCONFIG_PATH", fontconfig, TRUE);
    g_free (fontconfig);

    certs = g_build_filename (files_dir, "ssl", "certs", "ca-certificates.crt", NULL);
    g_setenv ("CA_CERTIFICATES", certs, TRUE);
    g_free (certs);
    }

    if (!gst_init_check (NULL, NULL, &error)) {
    g_print ("GStreamer initialization failed!!\n");
    return;
    }
    Is there anything wrong? Could you help me look at it? Thank you very much.

    1. Well, you have to set GST_PLUGIN_PATH and GST_PLUGIN_SCANNER. And /data/data/user/files is probably not the right directory; you can get the correct directories from android.content.Context, and they are specific to your application.
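
      For the two variables themselves, a sketch in the style of the snippet above, with hypothetical application paths:

      /* directory containing the plugin .so files (hypothetical path) */
      g_setenv ("GST_PLUGIN_PATH", "/data/data/com.example.app/lib", TRUE);
      /* the gst-plugin-scanner helper binary shipped with the build (hypothetical path) */
      g_setenv ("GST_PLUGIN_SCANNER",
          "/data/data/com.example.app/lib/gst-plugin-scanner", TRUE);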

      1. Hi,
        I think you misunderstood what I said.
        I did not write an Android project to create an APK application; instead I wrote a C project to create an executable using ndk-build. So I don't have the directories from android.content.Context. I copied the fontconfig and ssl folders to the directory /data/data/user/files.
        Is it right to set the environment variables GST_PLUGIN_PATH and GST_PLUGIN_SCANNER to the path of libgstreamer_android.so?
        And do the functions gst_android_register_static_plugins() and gst_android_load_gio_modules() have to be implemented?

      2. As I said, when going that route you're on your own and will have to understand how things work in general on Android and inside GStreamer. I don't have the time to explain all these things to you in detail, and this is not a support forum.

  21. Hello Slomo,

    Thank you for your work.

    I installed the tutorials for iOS. After adding the VideoToolbox.framework, they work, except for tutorial 3 (where pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error)). The state changes to READY, but then when we hit play, nothing happens. Do you have an idea what it could be?

    Tutorial 3 is the closest to what I am trying to accomplish. I am trying to receive a stream via this command (which works on my computer): pipeline = gst_parse_launch("udpsrc port=5702 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! autovideosink", &error);

    But without tutorial 3 working, I don't know how to read that stream.

    Thank you very much,

    Jonathan

    1. This is unmodified tutorial 3? Which GStreamer version are you using?

      For the second pipeline, the RTP caps are incomplete. You have to give at least the media type and encoding name there. See the sink pad template caps in gst-inspect-1.0 on rtph264depay.
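
      Applied to the pipeline from the comment above, a sketch with the caps completed for H.264 would look like this:

      pipeline = gst_parse_launch (
          "udpsrc port=5702 caps=\"application/x-rtp,media=video,"
          "clock-rate=90000,encoding-name=H264,payload=96\" ! "
          "rtpjitterbuffer ! rtph264depay ! avdec_h264 ! autovideosink",
          &error);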

      1. It is an unmodified tutorial 3, yes, cloned from your git above.
        It is with version 1.5.1 (the latest, I think) of the SDK, downloaded from here: http://gstreamer.freedesktop.org/data/pkg/ios/1.5.1/

        Thank you for your comment about rtph264depay; I will check it. But I assumed it was correct, as it works on my desktop with that same command (and with the server launched as: gst-launch-1.0 videotestsrc horizontal-speed=5 ! x264enc tune="zerolatency" threads=1 ! rtph264pay config-interval=2 ! udpsink port=5702 host=[ip client]).

        But anyway, I cannot get any pipeline working on iOS via tutorial 3. I can only get tutorials 4 and 5 working with "playbin", and that stream is not read with "playbin" (error: cannot determine the type of stream).

        Thanks a lot.

        Jonathan

      2. Indeed, I just tried with 1.4.5 and tutorial 3 works. So I'll file a bug as you asked.

      3. I am trying to build a version of GStreamer for arm64 with cerbero, following what you explain here: https://coaxion.net/blog/2014/09/gstreamer-with-hardware-video-codecs-on-ios/
        But if I take the git master version, I have the above problem where I cannot read the stream. Is there a working version that I can build for arm64? If I take 1.4.5 and try to build it with cerbero (with "universal_archs=[Architecture.ARM64, Architecture.X86_64]"), it tells me "AttributeError: class Architecture has no attribute 'ARM64'".
        Which version should I take, and how can I build it for arm64?

        Thank you very much.

      4. You have to build GIT master (or 1.5.1), earlier versions don’t have ARM64 support. What “cannot read the stream” problem are you talking about?

      5. I am talking about the bug I filed and we discussed just above. If I use tutorial 3 and launch the pipeline via pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);, or any other pipeline, it does not work with v1.5.1.
        Because of this bug I use v1.4.5, where it works, but then I cannot build for arm64.

      6. Ah, plain RTP does not work with playbin, you’ll have to make it work with a custom pipeline. And for a custom pipeline, apparently glimagesink is currently broken on iOS.

        Do I understand you correct that glimagesink works with playbin?

  22. Hey slomo,

    I started using the GStreamer API on iOS. After a long time I was able to compile an old example of the 3rd tutorial with API 1.5.2. I want to show a Flash video stream on my iPhone, captured by an external network cam. The status changes from READY to PLAYING, but nothing happens in the UIView! What am I doing wrong? On the debug pane of Xcode I see network usage of 200 KB/s incoming!

    Btw:
    pipeline = gst_parse_launch("playbin uri=http://192.168.160.206/videostream.flv?user=admin&pwd=", &error);
    I changed the eglglessink to glimagesink in gst_ios_init.h

    Thx for help

    Chris

      1. Hey slomo,

        thx for the Bug Report, now everything works! 🙂

        With best regards
        Chris

      2. PS: Using a stream in .asf format does not work. The network stream is okay, but no picture is shown!! Is this a plugin problem?

      3. Please file a bug with a testcase (including the problematic files/streams) at http://bugzilla.gnome.org against GStreamer. Either you’re missing a plugin, or there’s a bug somewhere. Impossible to say without further information.

  23. Hey,

    I found the problem! The .asf stream format is corrupted!!! Now it will be very difficult to get this to run! 🙁

    Is it possible to give GStreamer a pointer to a byte stream that holds a fixed ASF stream, or can I only use the "playbin" function?

    1. Or is it possible to specify which codecs are used in the ASF container? It's ADPCM for audio and MJPEG for video!!!

      I didn't find any documentation about the pipeline arguments!

      1. Hey,

        with the debug output I found that the ASF demuxer doesn't like streams without timestamps!!

        Output:
        asf_payload_queue_for_stream: Got payload for stream 2 ts:0:00:00.000000000
        asf_payload_queue_for_stream: Got payload for stream 1 ts:0:00:00.000000000
        … this has no end!!

        The header with the used codecs (MJPEG, ADPCM) is shown and correct!

        Is it possible to disable that timestamp sync?

      2. Set the sync property in the audio/video sinks to FALSE. But such a stream would play as fast as possible then, which is probably also not what you want.
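
        For reference, that is a one-liner on the sink element (here `sink` stands for the audio or video sink instance):

        g_object_set (sink, "sync", FALSE, NULL);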

      1. With ffmpeg there is no problem, and VLC just shows me the first frame, because it has the same problems with syncing A&V!

  24. Hey slomo,

    I created debug logs for streaming an .asf file from a server, and for the stream. What I see is that souphttpsrc sends EOS and then the pads get activated! There are only three events (SEGMENT, EOS, FLUSH) that are handled in the asfdemux code! I looked into buffering the stream, but that is also not working. Is it possible that the demuxer is not able to stream directly from an HTTP source? If I am wrong, then I need a complete working pipeline from you or from somebody here in the community! Streaming from a local file and via a server works, so the container is not corrupt!!

    Thx for help!

    Wbr
    Chris

  25. Hi slomo,

    one question! I put the tutorial 3 example into another project and the video is not shown! Audio works perfectly and no error is shown in Eclipse! Btw: the SurfaceHolder.Callback does not call the surfaceChanged method; what could be wrong?

    Thanks for help!

    Chris

  26. Hey,

    I found the problem! I use an async callback in which I create the player. Meanwhile the SurfaceView calls the onChange method, so the player's overridden onChange method is never called, because there is no instance of it yet :-D. Creating the SurfaceView at runtime was the solution!

    1. You have to change the architectures in the Xcode project settings. If you use the 1.5.90 binaries, you can set it to armv7, armv7s, arm64 and i386.

      1. Yes, I am using the 1.5.90 binaries.

        I have changed the "Valid Architectures" value to "armv7, i386, arm64, armv7s", but no luck, same issue. I am trying to run the code in the iPhone 6 simulator.

        If I try to run on the iPhone 5 simulator, I get 145 issues. Almost all of them are "Apple Mach-O Linker Error…"

        Undefined symbols for architecture i386:
        "_OBJC_CLASS_$_UIApplication", referenced from:
        objc-class-ref in VideoViewController.o
        "_OBJC_CLASS_$_UIResponder", referenced from:
        _OBJC_CLASS_$_AppDelegate in AppDelegate.o
        "_OBJC_CLASS_$_UITableViewController", referenced from:
        _OBJC_CLASS_$_LibraryViewController in LibraryViewController.o
        "_OBJC_CLASS_$_UIView", referenced from:
        _OBJC_CLASS_$_EaglUIView in EaglUIVIew.o
        "_OBJC_CLASS_$_UIViewController", referenced from:
        _OBJC_CLASS_$_VideoViewController in VideoViewController.o
        "_OBJC_METACLASS_$_UIResponder", referenced from:
        _OBJC_METACLASS_$_AppDelegate in AppDelegate.o

      2. Add x86_64 to the architectures; then you will probably get the same linker errors as for i386. Those linker errors look like your project file is broken.

  27. I am getting the same error for iOS. Just as an experiment, I am trying to run the projects for Mac OS X. I have installed 1.5.90 for Mac and ran "gst-inspect-1.0" from Terminal. I got the following response:

    ~$ gst-inspect-1.0
    dyld: Library not loaded: /Users/jan/cerbero/dist/darwin_x86_64/lib/libgstreamer-1.0.0.dylib
    Referenced from: /Library/Frameworks/GStreamer.framework/Commands/gst-inspect-1.0
    Reason: image not found
    Trace/BPT trap: 5

    I am wondering about the mentioned path, “/Users/jan/…”.
    Can you please tell me how to resolve the issue?

    Thanks in advance.

  28. Hi,

    I have problems rendering some videos.
    My renderer is based on appsink and uses these caps:
    gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING, "NV21", "framerate", GST_TYPE_FRACTION, 30, 1, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1, NULL);
    The renderer itself is nearly the same as this one:
    http://stackoverflow.com/questions/22456884/how-to-render-androids-yuv-nv21-camera-image-on-the-background-in-libgdx-with-o

    I have uploaded a dot file from a working video, a dot file from the not working video and a screenshot of the not working video.

    working: http://s000.tinyupload.com/?file_id=05838761008867899389
    not working: http://s000.tinyupload.com/?file_id=04852192890748924713
    screenshot: http://s000.tinyupload.com/?file_id=03951151209806881058

    I hope there is a solution.

    Thanks Thomas

    1. This looks like a stride problem, and your caps also don’t contain the width and height, which are both required fields. Check the GstVideoInfo API to get the expected strides, plane offsets, etc. You have to ensure that these match, otherwise you will get what you see in the best case, or crashes in other cases.
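      A minimal sketch (width/height stand for the negotiated dimensions; note that the actual buffers may additionally carry a GstVideoMeta with per-buffer strides/offsets):

      GstVideoInfo info;

      /* the caps must carry width and height in addition to the format */
      caps = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "NV21",
          "width", G_TYPE_INT, width,
          "height", G_TYPE_INT, height,
          "framerate", GST_TYPE_FRACTION, 30, 1, NULL);

      /* and the renderer must use the strides/offsets GStreamer expects */
      gst_video_info_set_format (&info, GST_VIDEO_FORMAT_NV21, width, height);
      y_stride = GST_VIDEO_INFO_PLANE_STRIDE (&info, 0);
      uv_stride = GST_VIDEO_INFO_PLANE_STRIDE (&info, 1);
      uv_offset = GST_VIDEO_INFO_PLANE_OFFSET (&info, 1);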

      1. Thanks for your answer. I’m now using the stride and offset. The video is now a bit better, but the colors are in the wrong position.

        The image is 330 pixels wide and 248 pixels high.
        The first stride value is 332 and the second stride value is also 332.
        The first offset value is 0 and the second offset value is 82336.

        For this reason my y_buffer goes from byte 0 to byte 82336 and my uv_buffer goes from byte 82336 to byte 123504.

        I use these 2 buffers with the glTexImage2D function.

        Is there a mistake here:
        glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_LUMINANCE, y_stride, image_height, 0, GL20.GL_LUMINANCE, GL20.GL_UNSIGNED_BYTE, yBuffer);

        glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_LUMINANCE_ALPHA, uv_stride / 2, image_height / 2, 0, GL20.GL_LUMINANCE_ALPHA, GL20.GL_UNSIGNED_BYTE, uvBuffer);

        New video image: http://s000.tinyupload.com/?file_id=76905627627567262478

        I hope you can help me again.

        Thanks Thomas

      2. Why are you uploading things yourself to GL? Just use libgstgl for that, or at least take a look at its code to get these things correct 🙂
        I don’t see any obvious mistake here though, might be something in your shader or you’re not actually providing the data correctly in the buffer according to the width/height/strides you mentioned.

      3. I have now tried to use another video decoder.
        By default the “amcviddec-omxsecavcdec” decoder was used on my Galaxy S3. I have set the priority (rank) of this decoder to none and now my phone uses the “amcvideodec-omxgoogleh264decoder”.
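        For reference, I set the rank roughly like this (a sketch using the registry API; the element name is the one from my device):

        GstPluginFeature *feature =
            gst_registry_lookup_feature (gst_registry_get (), "amcviddec-omxsecavcdec");
        if (feature) {
          gst_plugin_feature_set_rank (feature, GST_RANK_NONE);
          gst_object_unref (feature);
        }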

        With this decoder everything looks perfect.
        I have also had bad experiences with other video files that use an androidmedia decoder.

        Is it normal that decoders from androidmedia sometimes don’t work correctly, or are these bugs that have to be fixed?

        What should I do if I find such a problem with a decoder? I don’t want to change the priority every time I have a problem.

      4. Please report a bug with information about the device, the codecs used, the stream that exposes the problem and a GStreamer debug log at https://bugzilla.gnome.org against GStreamer/gst-plugins-bad. It might be a bug in the GStreamer element, or in the Android codec. We’ll have to check.

  29. Hey slomo,

    I have downloaded the latest version of GStreamer (1.5.90) and set my GSTREAMER_ROOT_ANDROID environment variable to that folder, but there are some problems with the linker! It’s for Android development.

    Example…

    C:/Daten/Developing/libs/android-ndk-r9/toolchains/arm-linux-androideabi-4.6/prebuilt/windows-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld.gold.exe: warning: cannot scan executable section 1 of C:/Daten/Developing/libs/gstreamer-1.0-android-armv7-1.5.90/lib/libx264.a(pixel-a.o) for Cortex-A8 erratum because it has no mapping symbols.
    C:/Daten/Developing/libs/android-ndk-r9/toolchains/arm-linux-androideabi-4.6/prebuilt/windows-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld.gold.exe: warning: cannot scan executable section 1 of C:/Daten/Developing/libs/gstreamer-1.0-android-armv7-1.5.90/lib/libx264.a(mc-a.o) for Cortex-A8 erratum because it has no mapping symbols.

    C:/Daten/Developing/libs/android-ndk-r9/toolchains/arm-linux-androideabi-4.6/prebuilt/windows-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld.gold.exe: C:/Daten/Developing/libs/gstreamer-1.0-android-armv7-1.5.90/lib/libgraphene-1.0.a(libgraphene_1_0_la-graphene-quaternion.o): in function graphene_quaternion_init_from_angles:graphene-private.h:96: error: undefined reference to ‘sincosf’
    C:/Daten/Developing/libs/android-ndk-r9/toolchains/arm-linux-androideabi-4.6/prebuilt/windows-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld.gold.exe: C:/Daten/Developing/libs/gstreamer-1.0-android-armv7-1.5.90/lib/libgraphene-1.0.a(libgraphene_1_0_la-graphene-quaternion.o): in function graphene_quaternion_init_from_angles:graphene-private.h:96: error: undefined reference to ‘sincosf’

      1. Hey, found the problem! Just use the NDK r10e toolchain and it works. But one more thing… GStreamer should work on an Android 2.3.6 OS. I’ve tried Tutorial 3 on a Galaxy S2 (2.3.6) and an Xperia ST21i (4.0.4); both will not load libgstreamer_android.so! I tried loading only CORE into the .so file, but that does not work either! Maybe the lib is too big to load or something. Is there some special flag to set for compiling, or are these smartphones just too weak?

      2. It is supposed to work on those old Android versions, yes. Which version of GStreamer did you use? Check “adb logcat” for the exact error why it fails to load the libgstreamer_android.so.

  30. I use the latest version, 1.5.90. It runs on a Galaxy S3 or newer! Logcat just says that it tries to load the .so file, but the screen stays black and nothing more happens! I used a try/catch block, but no exception is thrown.

      1. Yeah, it gets stuck at the System.loadLibrary() call (line 554). What is strange is that when I only load the PLAYBACK plugin, the lib is loaded, so maybe there is something wrong with the CORE plugin. I looked into getting a native backtrace, but in this area I am a beginner; I don’t know how.

      2. Does it already lock up when only using the CORE and PLAYBACK plugins? Or only if you also add some of the CODEC or SYSTEM ones?

  31. I’ve tested it with (CORE,PLAYBACK,SYS,NET,CODECS,CODECS_RESTRICTED) -> not loading
    (PLAYBACK,SYS,NET,CODECS,CODECS_RESTRICTED) -> not loading
    (PLAYBACK) -> loading

    I’ve tested it also on Android 4.0.3 (HTC One, HTC Wildfire) -> not loading

    Maybe it is an OS problem.

    1. I’m expecting this to be related to the androidmedia plugin. Can you change gstreamer_android-1.0.c (in share/gst-android/ndk-build) to have gst_debug_set_threshold_from_string ("2,amc*:6", TRUE); after the gst_debug_set_default_threshold() call, then rebuild your application, run it and get the complete output of “adb logcat”? Also please report this on https://bugzilla.gnome.org against GStreamer.
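      The change would look roughly like this (a sketch; the exact level in the existing call may differ in your copy of the file):

      gst_debug_set_default_threshold (GST_LEVEL_WARNING); /* the existing call */
      /* level 2 for everything, level 6 (LOG) for the amc* categories */
      gst_debug_set_threshold_from_string ("2,amc*:6", TRUE);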

  32. Hi,

    I have problems using appsrc with the 1.5.x versions of the Android binaries.
    I’m using playbin, with “appsrc://” as the URI and the “source-setup” callback to configure the appsrc element, but when I try to start the pipeline with a 1.5.x version I get this error message: “Error received from element typefind: Could not determine type of stream.”
    The caps are generated with this code snippet:
    gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, 44100, 2, NULL);
    audio_caps = gst_audio_info_to_caps (&info);
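    For completeness, the appsrc is configured from the “source-setup” callback roughly like this (the handler name is mine):

    static void
    source_setup_cb (GstElement * playbin, GstElement * source, gpointer user_data)
    {
      GstAudioInfo info;
      GstCaps *audio_caps;

      gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, 44100, 2, NULL);
      audio_caps = gst_audio_info_to_caps (&info);
      g_object_set (source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
      gst_caps_unref (audio_caps);
    }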

    Here are some screenshots from the saved dot files.
    With 1.4.5: http://s000.tinyupload.com/?file_id=03142116902310035816
    With 1.5.2 and 1.5.90: http://s000.tinyupload.com/?file_id=21633680002178747919

    Please give me some help.
    Thanks

      1. Thanks for that info 🙂

        I have now built my own GStreamer library with cerbero to get the latest fixes, but I have some other problems :/

        What is the preferred way to seek with appsrc?
        I think I have to wait while the seek "gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH, GST_SEEK_TYPE_SET, desired_position, GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);" is in progress, but when I do this with "gst_element_get_state (pipeline, NULL, NULL, 10 * GST_SECOND);" I still have to send new buffers because of the asynchronous behaviour.
        When I send buffers while seeking without waiting, I get various problems. When I block the buffer pushing while seeking, the "gst_element_get_state (pipeline, NULL, NULL, 10 * GST_SECOND);" call waits for the full 10 seconds, but after those 10 seconds it resumes playing without any problem. Therefore I think that I have to wait to avoid my problems, but I don’t know how.

        Can you give me a simple example for seeking and pushing buffers from different threads, with locking? Maybe with pthread locks.

        Thanks

      2. I have run many tests now and figured out that it seems to be a threading problem. The main problem is that after seeking the current position does not update anymore, and the EOS callback is also not called after calling "gst_app_src_end_of_stream (GST_APP_SRC (audio_source))". I have made the tests with the 1.4.5 binary and also with a self-compiled version.

        This doesn’t work:
        //The seek function (called by a slider)
        pthread_mutex_lock(&seekMutex);
        seeking = true;
        pthread_mutex_unlock(&seekMutex);
        bool ret = gst_element_seek_simple (this->pipeline, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT), desired_position);
        pthread_mutex_lock(&seekMutex);
        seeking = false;
        pthread_mutex_unlock(&seekMutex);

        //The appsrc "seek-data" callback
        pthread_mutex_lock(&player->audioTimestampMutex);
        player->audiotimestamp = offset;
        pthread_mutex_unlock(&player->audioTimestampMutex);

        //The push data function, called by a thread which reads a file
        //needAudioData is set to true by the "need-data" callback and to false by the "enough-data" callback
        bool isSeeking = false;
        pthread_mutex_lock(&seekMutex);
        isSeeking = seeking;
        pthread_mutex_unlock(&seekMutex);

        if ((currentState >= GST_STATE_PAUSED || pendingState >= GST_STATE_PAUSED) && needAudioData && !isSeeking)
        {
          GstBuffer *buffer;
          GstFlowReturn ret;
          buffer = gst_buffer_new_and_alloc (buffersize);

          pthread_mutex_lock(&audioTimestampMutex);
          GST_BUFFER_PTS(buffer) = audiotimestamp;
          GST_BUFFER_DURATION(buffer) = getFrameTime(audio_caps, gst_buffer_get_size(buffer));
          audiotimestamp += GST_BUFFER_DURATION(buffer);
          pthread_mutex_unlock(&audioTimestampMutex);

          gst_buffer_fill(buffer, 0, data, buffersize);
          ret = gst_app_src_push_buffer(GST_APP_SRC(audio_source), buffer);
        }

        This works, but is not very pretty:
        bool ret = gst_element_seek_simple (this->pipeline, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT), desired_position);
        sleep(1);
        pthread_mutex_lock(&seekMutex);
        seeking = false;
        pthread_mutex_unlock(&seekMutex);

        Simply waiting a second after the seek solves the problems, but isn’t there a better solution?
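        I have also experimented with waiting for the ASYNC_DONE message on the bus instead of sleeping (roughly like this, in my bus handler), which seems cleaner, but I am not sure it is the intended approach:

        /* a flushing seek posts ASYNC_DONE once the pipeline
           has prerolled again */
        case GST_MESSAGE_ASYNC_DONE:
          pthread_mutex_lock(&seekMutex);
          seeking = false;
          pthread_mutex_unlock(&seekMutex);
          break;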

        Please give me some help.
        Thanks

      3. Please ask on the mailing list, this is not a support forum and your question is also completely unrelated to the original topic.

  33. Hey,

    Is it possible to use GStreamer twice at the same time?
    I have two Activities, but I see the video of the first Activity on the second Activity too. (Tab bar with two tabs.)

    Thanks in advance.

      1. Okay,

        but how can I release the first surface? I call nativeSurfaceFinalize and nativeFinalize, but the stream image still covers the new video image!

        Is it a problem if I call GStreamer.init() twice with two different loaded libraries (Tutorial 3 and Tutorial 5)?

      2. No, you can call it multiple times. The surface handling is normal Android stuff; you need to manage the surfaces so that they work the way you want.

  34. Okay, thanks,

    I found a way, but just one more little question.
    Does GStreamer support gaps between two MPEG-TS files in an HLS stream? I have a working version with playbin, but when the player reaches the gap, it does not jump to the next file, which is 15 min (+900000 ms) in the future!

    Do I need to set a flag or something like that?

  35. Hi Slomo!
    I see you said that you ported the Android and iOS examples for Visual Studio, but I can’t find them in the gst-sdk-tutorials. Where can I find them, or how can I use these examples in Visual Studio?
    Regards!

    1. You can’t compile iOS or Android apps in MSVC. What is available for MSVC is the source code of the non-mobile tutorials for Windows.

      1. And can I build a shared library in C in Eclipse, and after that use P/Invoke from C# Xamarin code in VS?

    1. You should be able to use the GStreamer binaries for that, and also the gstreamer-sharp bindings. It probably needs some custom build system stuff to work properly (don’t ask me what, I would have to try myself), but I know that someone did that in the past already.

      1. Ok, thanks!
        I successfully built the modified “GStreamer 1.0 ‘tutorial 5’ for Android Studio” (https://github.com/jaroslavas/Gstreamer-Android-example) and got 2 .so libraries: libtutorial-5.so and libgstreamer_android.so. Now I am trying to use them from a Xamarin.Android app using DllImport, but the output says: “dlopen failed: could not load library ‘libgstreamer_android.so’ needed by ‘libtutorial-5.so’; caused by library ‘libgstreamer_android.so’ not found”.
        So the app can’t find libgstreamer_android.so, but of course I added these two libraries to my project (I have successfully imported some other libraries, so I know exactly how to do this). What am I doing wrong?
        And one more thing: I haven’t used JNI in the C code (just pure C) because the library needs to be used from C# (Xamarin.Android), not from Java, but I understand that I can’t make it that easy. So what is the easiest and best way to use native GStreamer C code (if it’s possible without JNI) via C# Xamarin.Android (besides gstreamer-sharp)?
        Thank you in advance!
        Regards!

      2. Those are the manual hacks I’m talking about: you need to make sure the libraries are found by setting the correct dynamic linker library paths, etc. If you don’t want to use any Java, on Android you’ll have to call gst_android_init(), see here: http://cgit.freedesktop.org/gstreamer/cerbero/tree/data/ndk-build/gstreamer_android-1.0.c.in#n417

        It still requires JNI objects though: the JNIEnv and the android.content.Context object of your application. You should be able to get those via Xamarin, as they are in general required to communicate with other Android APIs.
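        A rough sketch of the native side (assuming the declaration from the linked gstreamer_android-1.0.c.in; verify it against your copy):

        /* as declared in gstreamer_android-1.0.c.in */
        void gst_android_init (JNIEnv * env, jobject context);

        /* call it once with a JNIEnv and your application Context,
           both obtainable through Xamarin's JNI bridge */
        gst_android_init (env, application_context);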

      3. And by the way, which operating systems are supported by gstreamer-sharp (I have found a lot of conflicting information about this)?

      4. Linux, BSD, Windows, OS X, iOS and Android should all work. The main problem is going to be that building for everything but Linux/BSD/OSX is going to be tricky and involves manual hacks.

      5. Ok, and one more question: if I want to build gstreamer-sharp 1.0 (not 0.10) for Linux, Mac OS X, Android and iOS, where can I find examples which describe how to do this? Or can I just compile the gstreamer-sharp 1.0 .dlls and use them via Mono (for Linux and Mac OS X) and Xamarin (for Android and iOS)?

      6. There’s no documentation about that as far as I know. You’ll have to understand how these things work and then do it yourself, and ideally write it down somewhere for posterity 🙂

        The .dlls should be reusable, the tricky part is the glue library with native code and integrating all this into Xamarin’s and GStreamer’s build systems.

      7. And exactly which parts of the code will I need to change on my own? And how difficult will it be?

      8. Many thanks for your support! I already successfully built gstreamer-sharp on Windows and am now trying to build it for Linux and Android. I understand how I will try to use gstreamer-sharp on Linux. But how can I start preparing gstreamer-sharp for Android? I can’t find any tutorials or even any mention of “gstreamer-sharp on Android”. So what do I need to do?

      9. Just try it and see where you run into problems. First of all, try building a native GStreamer application for Android to understand how the build system works… and then you’ll have to integrate that somehow into the Xamarin build process.

      10. Sorry, I forgot to say: of course I meant using gstreamer-sharp via Xamarin.Android.

      11. Finally, I decided to use just native GStreamer (not gstreamer-sharp). As I said before, I built native GStreamer and ran it using Java code, but the Xamarin.Android project said: ‘dlopen failed: could not load library “build/obj/local/armeabi-v7a/libgstreamer_android.so” needed by “libtutorial-5.so”’. I understand why this error occurs: libtutorial-5.so depends on libgstreamer_android.so and tries to find it at the “build/obj/local/armeabi-v7a/libgstreamer_android.so” location. But of course the two libraries are located in the lib/armeabi-v7a directory (in the Xamarin project), and even if I place both libraries, or just libgstreamer_android.so, at that location, I still get this error.
        So how can I change the path of the libgstreamer_android.so library (on which my first library depends) in Android Studio, or just in the .mk files (at the .so build stage)? You can check the full question on gstreamer-devel: http://gstreamer-devel.966125.n4.nabble.com/Gstreamer-on-Android-Issue-with-the-dependent-library-td4675705.html.

        And which scheme of use do you think is better: | Xamarin.Android C# project -> GStreamer C library -> GStreamer SDK | OR | Xamarin.Android C# project -> Java intermediate library -> GStreamer C library -> GStreamer SDK |? By the way, what is libgstreamer_android.so? Is it all I need to use GStreamer, or is it just the GStreamer core?
        And the last question, about GStreamer on iOS: is Objective-C the best way to use GStreamer on iOS?
        Sorry for the many questions, but I really can’t find answers to them.
        Regards!

  36. Hey,

    Is it possible to turn the surface of the video output black in native code? I’ve tried it via the SurfaceView in Android with lockCanvas and such, but I always run into threading problems! A black screen would be great while switching between streams or video files!

      1. Either when the video is paused, or when I just call a function where this mechanism is implemented! I just want to know if it is possible and where I have to look.

      2. The Surface is owned by GStreamer unless you completely stop the pipeline, so what you can do is to hide it in the UI or put something else in the foreground. Or you have to write your own sink integration and do the rendering of the GL textures manually, in which case you can just render in any way you want.

  37. At the moment I put a black view in front of the output, but when I make this view invisible too late, the output screen stays black. Only when I turn the smartphone from portrait to landscape does the screen re-render the video output correctly! Okay, so I have to look for a workaround in Android, because I don’t want to implement my own renderer!

  38. Hi,

    How can I build android-mips with cerbero?

    Are the cerbero instructions on your site outdated?

    The last update of the site was in 2013!

  39. Hey slomo,

    I am trying to use GStreamer with an HLS stream with SSL encryption.
    I load the gnutls module into the iOS example and it all works on iPhone 6 and newer. But everything older than the iPhone 6 (iPhone 4s, 5, 5c, 5s) does not work. I get this assertion from the debugger:

    Assertion failed: (nbytes <= size * sizeof (mp_limb_t)), function _nettle_ecc_modq_random, file ecc-random.c, line 66.

    I think the problem is the 32/64-bit processor architecture! Do you know anything about it?

    I use GStreamer v1.6!

  40. Hi,

    I want to stream camera flash video (FLV) over the network/internet with the following pipeline:

    souphttpsrc location=urlToVideo/videostream.flv ! decodebin name=dec dec. ! autovideosink dec. ! volume volume=0 ! autoaudiosink async-handling=true

    The problem is that the stream has a latency of over 8 seconds, even on a direct network connection.

    Is it possible to get this down to 2 seconds or lower?

    Thx for help!

    Best regards John

    1. Possible, yes. But first you’ll have to measure where exactly the latency is introduced, and how much at each stage. There is sender-side latency, network buffering, receiver-side buffering and latency introduced by the receiver (decoders, ringbuffers in the audio sink). Each of these can be tuned, and depending on your exact use case and requirements you can get down to less than a second over HTTP.
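      As a starting point for measuring, newer GStreamer versions (1.8 and later) ship a latency tracer that logs per-element latency; a sketch of how to enable it from the shell:

      GST_TRACERS=latency GST_DEBUG=GST_TRACER:7 gst-launch-1.0 souphttpsrc location=... ! decodebin ! autovideosink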

  41. Hi Slomo,
    Thanks for sharing your work. I am trying to build the tutorials and failed to build the NDK libraries with the following error. It’s a linker error which I can’t seem to fix. Have you or anyone else seen this? Any help will be appreciated.

    ~/work/android-experiments/gst-sdk-tutorials/gst-sdk/tutorials/android-tutorial-1 (master) 96b0ce2$ ndk-build
    Android NDK: WARNING: APP_PLATFORM android-23 is larger than android:minSdkVersion 9 in ./AndroidManifest.xml
    GStreamer : [GEN] => gst-build-armeabi-v7a/gstreamer_android.c
    GStreamer : [COMPILE] => gst-build-armeabi-v7a/gstreamer_android.c
    GStreamer : [LINK] => gst-build-armeabi-v7a/libgstreamer_android.so
    /home/slomo/Projects/android/android-ndk-r10e/platforms/android-9/arch-arm/usr/include/signal.h:113: error: undefined reference to ‘bsd_signal’
    /home/slomo/Projects/android/android-ndk-r10e/platforms/android-9/arch-arm/usr/include/signal.h:113: error: undefined reference to ‘bsd_signal’
    /home/slomo/Projects/android/android-ndk-r10e/platforms/android-9/arch-arm/usr/include/signal.h:113: error: undefined reference to ‘bsd_signal’
    /home/slomo/Projects/android/android-ndk-r10e/platforms/android-9/arch-arm/usr/include/signal.h:113: error: undefined reference to ‘bsd_signal’
    collect2: error: ld returned 1 exit status
    make: *** [buildsharedlibrary_armeabi-v7a] Error 1

    Thanks,
    Purnendu

    1. This looks like a problem in your NDK installation: it doesn’t find symbols for things that are exposed by the NDK headers.
