GStreamer with hardware video codecs on iOS

Update: GIT master of cerbero should now also compile fine with Xcode 6 for x86/x86-64 (simulator)

In the last few days I spent some time on getting GStreamer to compile properly with the Xcode 6 preview release (which as of today is available as a stable release), and on making sure everything still works with iOS 8. This should be the case now with GIT master of cerbero.

So much for the boring part. But more importantly, iOS 8 finally makes the VideoToolbox API available as public API. This allows us to use the hardware video decoders and encoders directly, and opens lots of new possibilities for GStreamer usage on iOS. Before iOS 8 it was only possible to directly decode local files with the hardware decoders via the AVAssetReader API, which of course only allows rather constrained GStreamer usage.

We already had elements (for OS X) using the VideoToolbox API in the applemedia plugin in gst-plugins-bad, so I tried making them work on iOS too. This required quite a few changes, and in the end I rewrote big parts of the encoder element (which should also make it work better on OS X btw). But with GIT master of GStreamer you can now directly use the hardware codecs on iOS 8 by using the vtdec decoder element or the vtenc_h264 encoder element. There’s still a lot of potential for improvements but it’s working.
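
Something like the following should work as a quick test (a sketch; the exact caps, bitrate, and sink will depend on your setup):

gst-launch-1.0 videotestsrc num-buffers=300 ! vtenc_h264 bitrate=2000 ! h264parse ! qtmux ! filesink location=test.mov
gst-launch-1.0 filesrc location=test.mov ! qtdemux ! h264parse ! vtdec ! videoconvert ! autovideosink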

Notes

If you compile everything from GIT master, it should still be possible to use the same application binary with iOS 7 and earlier versions. Just make sure to use "-weak_framework VideoToolbox" for linking your application instead of "-framework VideoToolbox". On earlier versions you just won't be able to use the hardware codecs.
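
In the application you can then check at runtime whether the hardware codecs are usable; a sketch (this assumes the applemedia plugin simply doesn't register the elements when the framework symbols are unavailable):

/* Only use the hardware decoder if it was registered at runtime */
GstElementFactory *factory = gst_element_factory_find ("vtdec");
if (factory != NULL) {
  /* iOS 8 or newer: hardware decoding is available */
  gst_object_unref (factory);
} else {
  /* iOS 7 or older: fall back to a software decoder */
}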

Currently, compiling cerbero GIT master for iOS x86 and x86-64 will fail in libffi; only the ARM variants work. So don't build with "./cerbero-uninstalled -c config/cross-ios-universal.cbc" but with "cross-ios-arm7.cbc". And if you need to run bootstrap first, run it from the 1.4 branch for now and then switch back to the master branch. I'm working on fixing that next week.
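
Putting that workaround together, the steps look roughly like this (a sketch):

git checkout 1.4
./cerbero-uninstalled -c config/cross-ios-arm7.cbc bootstrap
git checkout master
./cerbero-uninstalled -c config/cross-ios-arm7.cbc package gstreamer-1.0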

98 thoughts on “GStreamer with hardware video codecs on iOS”

  1. Very cool news!
    I was trying to get GStreamer working with appsink and AVSampleBufferDisplayLayer for the last two days, but it did not work as expected. The code is running and the AVSampleBufferDisplayLayer status is rendering; the only thing missing is that I can't see the output. I tried a lot with kCMSampleAttachmentKey_DisplayImmediately but didn't get it working.

    A very nice intro for me was this.

    I don't know whether what I have done is useful, but maybe GStreamer support for AVSampleBufferDisplayLayer would be nice, because it is relatively easy to take the output of an rtph264depay and get it into a CMSampleBuffer for displaying with AVSampleBufferDisplayLayer.

      1. That's probably not going to give you much success 🙂 The best place for such things would be the GStreamer IRC channel (#gstreamer) on Freenode.

  2. Nice. Apple claims FaceTime will also work with H.265/HEVC on the iPhone 6. Is this codec also available with this API?

    1. I only tested H.264. What happens when you use MPEG-2? Can you get a debug log, and report a bug in Bugzilla with those details?
      Is this on OS X or iOS?
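
      To capture such a log from an app, something like this before creating the pipeline usually works (a sketch; adjust the categories as needed):

      g_setenv ("GST_DEBUG", "vtdec:6,2", TRUE);
      g_setenv ("GST_DEBUG_NO_COLOR", "1", TRUE);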

  3. Hi there, thanks for this post and your great leadership in GStreamer.

    Our aim is simply to use vtenc and vtdec on iOS, since we're getting high CPU usage with the regular x264enc. We're completely new to the GStreamer build process. Please let us know if we're doing anything wrong here and how to get to the vtenc and vtdec binaries: we're trying to build the latest code from master (http://cgit.freedesktop.org/gstreamer/cerbero/tree/), generally following this guide: http://docs.gstreamer.com/display/GstSDK/Building+from+source+using+Cerbero on Yosemite for ARM64.

    Is there any other guide elsewhere for building the latest master using cerbero, or should we follow this one?

    BTW, is there any currently planned ETA for 1.4.4?

    1. 1.4.4 is planned for some time this week or next week, we'll see. But it probably won't contain ARM64 support yet; I'm not sure I want to backport all those changes to 1.4 as they are not small.

      For the build process, just check the README file inside cerbero. The main difference to what that website says is that you now run "package gstreamer-1.0" instead of "package gstreamer-sdk".

      1. Thanks Sebastian!

        Even though ARM64 is not ready for 1.4.4, we can build ARM64 ourselves by doing a cerbero build, right?

        We'll post to the development mailing list if we face any issues with the cerbero build process.

      2. Yes, if you build from GIT master of cerbero you can build ARM64 binaries. Update to today's latest version (I just changed something) and then run "./cerbero-uninstalled -c config/cross-ios-universal.cbc package gstreamer-1.0" to get binaries for all ARM architectures.

      3. Hi Sebastian, is there any reference URL for the vtenc and vtdec parameters? For example, will key-int-max=90 and tune=zerolatency work like they do for the x264enc encoder?

      4. No, currently vtdec has no properties at all and vtenc_h264 has only a bitrate parameter. We should add some more though; VideoToolbox has more parameters for the encoder.
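
        Setting that one property from code is the usual g_object_set() (a sketch; the bitrate is in kbit/s):

        GstElement *enc = gst_element_factory_make ("vtenc_h264", NULL);
        g_object_set (enc, "bitrate", 2000, NULL); /* target bitrate in kbit/s */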

    2. Hi Sebastian!

      We were able to use the non-arm64 version of vtenc_h264. However, there are a lot of blocking artifacts in the resulting video. Examining the code of vtenc.c, we saw that it uses the Baseline 2.1 profile by default.

      Is there any way to use higher profiles? Our use case is streaming iPhone camera capture to RTMP (nginx-rtmp). Shouldn't we use at least the Extended or Main 3.1 profile? How do we set that with vtenc?

      Here’s our current pipeline:

      avfvideosrc ! video/x-raw,format=NV12,width=1280,height=720,framerate=30/1 ! tee name=tp tp. ! queue ! autovideosink tp. ! queue ! videoconvert ! vtenc_h264 bitrate=6500 ! queue ! mux. autoaudiosrc ! audioconvert ! audiorate ! voaacenc bitrate=64000 ! queue ! mux. flvmux streamable=true name=mux ! queue ! rtmpsink location='rtmp://ww1.example.com:1935/live/ios-cam-testing-009 live=1'

      1. There are ways to choose higher profiles, but that's currently not exposed as a property on vtenc_h264. It's relatively easy to add though. There are also many other parameters that would be nice to expose as properties on the element.

      2. Thanks, Sebastian.

        To achieve our objective, can we rebuild the SDK after making just the following changes to the vtenc.c code (or do we need to change anything else anywhere)?

        1. Change the baseline 2.1 profile setting in this line to kVTProfileLevel_H264_Extended_AutoLevel:
        http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/applemedia/vtenc.c#n558

        2. (?) We also need to change the max keyframe interval duration by following Apple HLS best practices so that there’s one keyframe every 3 seconds. For that, should we change the following line?
        http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/applemedia/vtenc.c#n569
        to:
        gst_vtenc_session_configure_max_keyframe_interval_duration (self, session, 3);

        3. For the software-based encoder x264enc we use tune=zerolatency. In vtenc's case, is there any equivalent setting we can use, or will using the "Extended" profile itself be sufficient for the live streaming use case?
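
        Concretely, for changes 1 and 2 above we imagine something like this (a sketch; the first call uses the raw VTSessionSetProperty() API rather than whatever helper vtenc.c wraps around it):

        /* 1. request the Extended profile instead of Baseline (sketch) */
        VTSessionSetProperty (session, kVTCompressionPropertyKey_ProfileLevel,
            kVTProfileLevel_H264_Extended_AutoLevel);

        /* 2. one keyframe every 3 seconds, per the Apple HLS recommendations */
        gst_vtenc_session_configure_max_keyframe_interval_duration (self, session, 3);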

        Thanks a lot for your help, Sebastian!

      3. Disable frame reordering, which would be disabled by default with the baseline profile. Other than that you'll have to experiment with the different settings 🙂

        If you want to expose more properties on vtenc_h264, patches would be great to have 🙂 Everything except for profile/level would be done via GObject properties; profile/level would be negotiated via caps.
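
        Once that exists, choosing a profile would presumably be a caps filter downstream of the encoder, along these lines (a sketch, assuming it ends up being negotiated the way x264enc does it):

        ... ! vtenc_h264 bitrate=6500 ! video/x-h264,profile=main ! h264parse ! ...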

      4. Thanks, Sebastian! Really excited about this… it seems all the properties we needed for the live streaming use case are now exposed.

      5. Hi Sebastian, after many iterations and trial-and-error we're getting closer to making things work on iOS with fewer and fewer issues. However, there are still a few issues outstanding, and I wanted to bring to your attention the following one, which happens most of the time when the pipeline is being stopped. We're streaming either an rtspsrc or avfvideosrc out to two rtmpsinks. The app crashes upon stopping the pipeline and the crash log shows the following:

        Crash due to signal: SIGSEGV(SEGV_ACCERR) at 0000007c{
        0 Trace 0x00aa6326 _mh_execute_header + 11150118
        1 Trace 0x00aa2220 _mh_execute_header + 11133472
        2 Trace 0x00abdbc0 _mh_execute_header + 11246528
        3 Trace 0x00b3bc50 _mh_execute_header + 11762768
        4 Trace 0x00b3bdf8 _mh_execute_header + 11763192
        5 Trace 0x00b6e490 _mh_execute_header + 11969680
        6 Trace 0x00b3b660 _mh_execute_header + 11761248
        7 Trace 0x003eacd0 _mh_execute_header + 4091088
        8 Trace 0x009e6d78 _mh_execute_header + 10366328
        9 libsystem_pthread.dylib 0x34ab5e90 + 136
        }

      6. Sorry, Sebastian, I wanted to send you the following log; the previous crash log also happens sometimes. The most consistently generated crash log is:

        Crash due to signal: SIGSEGV(SEGV_ACCERR) at 00000034{
        0 Trace 0x00159d60 _mh_execute_header + 1400160
        1 VideoToolbox 0x265743a0 + 264
        2 libdispatch.dylib 0x304fd7b8 + 8
        3 libdispatch.dylib 0x305055b0 + 952
        4 libdispatch.dylib 0x304fff80 + 80
        5 libdispatch.dylib 0x30506b98 + 336
        6 libdispatch.dylib 0x30507cd0 + 88
        }

      7. Hi Sebastian,

        I have another general question. The arm64 architecture is not in the official build (but gets generated if a custom build is done). So which architecture should be used for the iPhone 6 and 5s?

      8. All armv7 variants will work on them. arm64 will first be included officially in the 1.5/1.6 release series.

      9. Thanks for the information, Sebastian!

        We’ll post further consistent crashes within GStreamer to the bug tracker. Thanks again.

  4. Hi there, thanks for this post.

    I have built cerbero followed by GStreamer using the following commands:
    ./cerbero-uninstalled -c config/cross-ios-universal.cbc bootstrap
    ./cerbero-uninstalled -c config/cross-ios-universal.cbc package gstreamer-1.0

    I was expecting a GStreamer.framework file somewhere inside the build output. Could you please let me know how to get the GStreamer.framework file from the build output? Needless to say, I am completely new to this GStreamer build process.

      1. Thanks Slomo.

        I got the .pkg file and added the GStreamer.framework to my Xcode project. After building I got the following error:

        Lexical or Preprocessor issue. arm64/glibconfig.h file not found.

        I don't have any clue where I did something wrong. Could you please guide me on how to resolve the issue?

        screenshot:
        http://prntscr.com/538wzd

      2. Thanks Slomo for your instant response.

        Yes, the file glibconfig.h does exist in the ios_universal directory, though not in an arm64 subfolder, only in these subfolders:

        cerbero ▸ dist ▸ ios_universal ▸ armv7s ▸ lib ▸ glib-2.0 ▸ include ▸ arm
        cerbero ▸ dist ▸ ios_universal ▸ armv7s ▸ lib ▸ glib-2.0 ▸ include
        cerbero ▸ dist ▸ ios_universal ▸ armv7 ▸ lib ▸ glib-2.0 ▸ include ▸ arm
        cerbero ▸ dist ▸ ios_universal ▸ armv7 ▸ lib ▸ glib-2.0 ▸ include
        cerbero ▸ dist ▸ ios_universal ▸ lib ▸ glib-2.0 ▸ include ▸ arm
        cerbero ▸ dist ▸ ios_universal ▸ lib ▸ glib-2.0 ▸ include ▸ i386
        cerbero ▸ dist ▸ ios_universal ▸ x86 ▸ lib ▸ glib-2.0 ▸ include ▸ i386
        cerbero ▸ dist ▸ ios_universal ▸ lib ▸ glib-2.0 ▸ include
        cerbero ▸ dist ▸ ios_universal ▸ x86 ▸ lib ▸ glib-2.0 ▸ include

        Is there any chance that if I redo the whole build process, or build only cross-ios-arm64.cbc instead of cross-ios-universal.cbc, the issue might get resolved?

      3. In those directories I also have an arm64 variant here. What's the cerbero commit you're using (top commit in "git log")? Also if you rebuild it, does it appear? In theory your arm64 build should've failed very early if that file did not exist.

        If you only build with the arm64 config, you won’t have binaries for any other architecture and it can only run on arm64 devices. Otherwise that’s possible, yes.

  5. Hi Sebastian,

    I have a problem with the VT encoder on iOS. GStreamer version is 1.4.5.

    This pipeline works fine on Mavericks:
    gst-launch-1.0 -v videotestsrc num-buffers=1000 ! vtenc_h264 bitrate=512 ! h264parse ! qtmux ! filesink location=1.mov
    But not on iOS, where I create the pipeline from an app based on one of your iOS tutorial apps.
    The code looks like this:
    NSString *cl = [NSString stringWithFormat:@"videotestsrc num-buffers=1000 ! vtenc_h264 ! h264parse ! qtmux ! filesink location=%@", tempFile];
    pipeline = gst_parse_launch([cl UTF8String], &error);

    and I have callbacks assigned just for the error and EOS messages. I'm seeing that VT encoding is progressing:

    0:00:25.193202000  3423 0x38aa068 INFO  vtenc vtenc.c:1020:gst_vtenc_update_latency: latency status 0 frames 1 fps 30/1 time 0:00:00.033333333

    But eventually it stops with no EOS message. CPU drops to 0%, the output file is malformed, and this is the last message written to the Xcode console:
    0:00:29.925321000  3423 0x38aa098 INFO  basesrc gstbasesrc.c:2724:void gst_base_src_loop(GstPad *): pausing after gst_base_src_get_range() = eos
    0:00:29.925486000  3423 0x38aa098 INFO  task gsttask.c:300:void gst_task_func(GstTask *): Task going to paused

    The iOS pipeline works fine if I change to the x264enc element.
    I'm pretty new to GStreamer, so I'm not really sure where to dig next. Could you point me in the right direction?

    Thanks,
    Denis

    1. That might be this bug, which will be fixed with 1.4.6 and 1.5.1. If you get a backtrace of all threads when that happens I can tell you more.

      1. Thanks Sebastian. I've got a snapshot of the running threads in Xcode, and it's exactly the same locking issue in gst_vtenc_finish().

  6. Hi Christian,
    I've found an issue in vtenc.c where the 'quality' setting doesn't get set on the VTCompressionSession object, because self->session is not yet set within gst_vtenc_create_session(). The fix is to replace

    gst_vtenc_set_quality (self, self->quality);

    with

    gst_vtenc_session_configure_property_double (self, session,
    kVTCompressionPropertyKey_Quality, self->quality);

    which also avoids the redundant lock on self that calling gst_vtenc_set_quality would take.

    I'm trying to rebuild plugins-bad locally with this patch, but the 'buildone' option seemingly forces a fetch, so my local change gets overwritten. How can I omit the fetch phase?

    Thanks,
    Denis

  7. Hi Christian,
    Though not directly related to the topic, this is a critical question for me: is the 'avvideoscale' element not included in the 1.4.5 build? I've checked the OS X and iOS distributions and neither recognizes the element name, though I see the plugin code in the libav code tree. The reason I'm interested in 'avvideoscale' is that the standard 'videoscale' performance on iOS is quite low (this does not apply to OS X, which works swiftly); i.e. it gives a 5x increase in re-encoding time with my mov file sample (encoding a 1920×1080 mov to 2 Mbps bitrate and 640×480 resolution).
    Thanks,
    Denis

    1. avvideoscale is disabled in 1.x because it’s not properly ported yet. Can you report a bug, and also check if ORC is used by videoscale on your device (or if it falls back to the backup C functions)?

      1. Thanks Christian for the quick reply. About ORC: how can I check that it's used? Is liborc statically linked into the GStreamer framework?

      2. (Sebastian, not Christian 🙂 )

        It is, and you can check if it works by setting ORC_DEBUG=6 in the environment. For example with g_setenv(). This has to happen before you create your pipeline, and will then cause lots of stuff to be printed on the terminal. Just provide that output 🙂
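
        In code that would be something like this (a sketch):

        /* must happen before creating the pipeline */
        g_setenv ("ORC_DEBUG", "6", TRUE);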

      3. Sorry Sebastian 🙂 end of long day 🙂

        Here's the ORC output in the log. Init stage:

        ORC: INFO: orcdebug.c(70): void _orc_debug_init()(): orc-0.4.23.1 debug init
        ORC: INFO: orcprogram-neon.c(129): void orc_neon_init()(): marking neon backend non-executable

        and then there’s continuous warnings like this one during the pipeline execution:

        ORC: WARNING: orccompiler.c(392): OrcCompileResult orc_program_compile_full(OrcProgram *, OrcTarget *, unsigned int)(): program orc_combine4_12xn_u8 failed to compile, reason: Compilation disabled, using emulation

      4. I forgot that I'm using a locally compiled GStreamer. I'll try with the official 1.4.5 and update here.

      5. Yep, I see the same problem with the official 1.4.5 build – the ORC compiler fails and switches to emulation mode.

      6. No. There's nothing more specific about why NEON is disabled, but tracing with the debugger shows that orc_arm_get_cpu_flags in orccpu-arm.c has practically no executable code for iOS (one branch is Linux-only and one is Android-only) and will always return 0 (which means no NEON support). I think NEON should definitely be supported on armv7, but I'll double-check on a more recent iPad generation. Looks like a bug in ORC to me, though.

      7. Could you file a bug report with this information?

        One possibility would be that we can’t get executable memory on iOS, which wouldn’t be too surprising…

      8. I've applied a quick hack locally to orc_arm_get_cpu_flags to return the ORC_TARGET_NEON_NEON flag, so that the ORC compiler is no longer disabled. I'm now seeing a noticeable videoscale speed-up on the iPad mini. However, on my original iPad 3rd gen I'm getting segfaults from different GStreamer/ORC bridges like video_orc_chroma_up_v2_u8 (videoscale plugin), video_test_src_orc_splat_u32 (videotestsrc), etc.
        Per the hardware info, the iPad 3rd gen uses the A5X chip while the iPad mini (1st gen) uses the A5, so it's really strange that ORC works fine on the second but not on the first. Should I file this against GStreamer or against ORC directly?

      9. Hi Sebastian and Denis,

        We're also using videoscale to generate a 360p version of the video from a 540p avfvideosrc capture, and videoscale has caused a loss of fluidity in both capture preview and streaming on all of the iPhone 4x/5x/6.

        We're also eager to try out Denis' workaround of making orc_arm_get_cpu_flags return the ORC_TARGET_NEON_NEON flag. We'll follow this here and in the bug report.

        Any idea when avvideoscale might get ported?

  8. Hello Slomo,

    I would like to build GStreamer for iOS with the very latest changes, for example with this bug fix:
    https://bugzilla.gnome.org/show_bug.cgi?id=744585

    Unfortunately, I am not getting the latest changes when I build GStreamer. I have posted an issue on the devel mailing list:

    http://gstreamer-devel.966125.n4.nabble.com/GStreamer-iOS-how-to-get-the-latest-code-in-custom-build-td4670699.html

    Any idea why this might happen? Thank you in advance.

  9. Hello Slomo,

    Can you please shed some light on when we can expect the next official GStreamer iOS universal build?

    Thank you in advance.

    1. Hopefully very soon; there are just some GL-related issues that have to be solved before that, and then there should be 1.5.1.

      1. Hello Slomo,

        Sorry to bother you again. Can you please provide a tentative date for the next official GStreamer iOS universal build?

        Thank you in advance.

  10. I have an issue running the tutorials in the simulator. Even with just tutorial 3, when I press the play button, gst_element_set_state hangs.

    I have cross-compiled the iOS framework from the latest 1.5.0 source. I had to do this in order to get x86_64 to work; the official releases don't have an x86_64 headers directory and glibconfig.h complains about it.

    I had to add VideoToolbox to the linked libraries, and then it built fine, but it doesn't run. Is there something I'm doing wrong?

    I'm using Yosemite 10.10.3 and Xcode 6.3.1, trying to compile for iOS SDK 8.3 and run in the simulator.

    Thank you for your help!

    Bryan

    1. According to your mail to gstreamer-devel, this is all working fine now? Then please let's continue the discussion there.

  11. Hi Sebastian,

    I am developing an iOS app with GStreamer and just built it for arm64 by using ./cerbero-uninstalled -c config/cross-ios-universal.cbc package gstreamer-1.0

    After installing the package, adding it to my project, and trying to rebuild the app, I got the error:
    duplicate symbol _ff_log2_tab in:
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libavcodec_a_arm64_-log2_tab.o)
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libavformat_a_arm64_-log2_tab.o)
    duplicate symbol _ff_log2_tab in:
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libavcodec_a_arm64_-log2_tab.o)
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libavutil_a_arm64_-log2_tab.o)
    duplicate symbol _iconv_locale_charset in:
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libcharset_a_arm64_-localcharset.o)
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libiconv_a_arm64_-localcharset.o)
    duplicate symbol _hash_pjw_bare in:
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libgnutls-openssl_a_arm64_-hash-pjw-bare.o)
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libtasn1_a_arm64_-hash-pjw-bare.o)
    duplicate symbol _strverscmp in:
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libgnutls-openssl_a_arm64_-strverscmp.o)
    /Users/liutingdu/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/GStreamer(libtasn1_a_arm64_-strverscmp.o)
    ld: 5 duplicate symbols for architecture arm64
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    ld: 5 duplicate symbols for architecture armv7

    Any idea how I can get rid of it?

    Thanks a lot,
    Liuting

      1. Hi Sebastian,

        Many thanks for your reply.

        I rebuilt it for arm64 because when I tried to submit the app to the App Store, it said 64-bit support was missing. But after adding 64-bit support it still doesn't work because of the error.

        Is there any other possible way just to get the app published? Or is it not possible to publish the app until the bug gets fixed?

        Thanks in advance,
        Liuting

      2. You could omit the libav and dtls plugins from your app; then it should link fine. You'll have to check which plugins to include before submitting it to the App Store anyway; otherwise, if you just include everything, you might include some plugins that have App Store incompatible licenses.
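
        If you're using the Xcode project template from the SDK, the generated gst_ios_init.h is what selects which static plugins get registered and linked into the app; a sketch from memory (the exact define names may differ in your copy):

        /* in gst_ios_init.h: comment out the plugins you want to omit */
        /* #define GST_IOS_PLUGIN_LIBAV */
        /* #define GST_IOS_PLUGIN_DTLS */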

  12. Thank you for the quick reply! 'Omit the libav and dtls plugins' – could you please provide more details about how to do this (or even a related link)? I am new to both GStreamer and iOS…

  13. Hello,
    I try to run a pipeline and get an error (GStreamer 1.5.2):
    gldisplay gstgldisplay.c:217:gst_gl_display_new: Could not create display. user specified (NULL) (platform: (NULL)), creating dummy

    In GStreamer <= 1.4.5, glimagesink works fine.

  14. Hello, I try to use the hardware H.264 decoder on Android. According to mediacodec.xml, the element is amcviddec-omxmtkvideodecoderavc. My pipeline is "videotestsrc ! x264enc ! h264parse config-interval=1 ! amcviddec-omxmtkvideodecoderavc ! videoconvert ! autovideosink", and it failed with the error "Error received from element amcviddec-omxmtkvideodecoderavc: GStreamer encountered a general supporting library error".
    What should I do? How can I know the caps of amcviddec-omxmtkvideodecoderavc?
    Thanks in advance.

  15. Hello,
    I try to run a pipeline on iOS 8.4.1 and get an error (GStreamer 1.5.9):
    basesink gstbasesink.c:2846:gboolean gst_base_sink_is_too_late(GstBaseSink *, GstMiniObject *, GstClockTime, GstClockTime, GstClockReturn, GstClockTimeDiff, gboolean): warning: There may be a timestamping problem, or this computer is too slow.

    Can you help me?

      1. Hello. I tried to ask for help here: http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
        but could not get an answer for 6 days.
        I solved the problem with latency, but I still have one with "invalid colorimetry, using default".
        Maybe you can help me?
        The problem comes every time I use vtdec on an iOS device or the simulator; the sender does not matter. I tried with a Raspberry Pi and with an RTSP link, and I always get the same result (a greenish image) and this warning: 0:00:06.509840000  3527 0x1484da30 WARN  video-info video-info.c:391:gboolean gst_video_info_from_caps(GstVideoInfo *, const GstCaps *): invalid colorimetry, using default

    1. You have to subscribe to the list to get replies, or explicitly say in your mails that you would like replies CC’d to you so people can manually send mails to you too 🙂

  16. Hi Sebastian,

    Just a quick question: we want to grab a 640×360 capture from avfvideosrc, but those caps are not supported on the iPhone 5/6/5s. We can't capture 640×480 and crop to 640×360, since none of the videocrop, videobox, or aspectratiocrop plugins are available on iOS.

    What could be the solution for our 640×360 capture need, then? Is there any way to build GStreamer with the missing cropping plugins for iOS?

      1. Many thanks for the pointer, Sebastian! We’ll try that…

        videoscale was causing an additional 60% CPU usage, which should be much lower with cropping.
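
        The crop we have in mind would be something like this (a sketch, assuming a build that includes videocrop; cutting 60 pixels off the top and bottom turns 480 into 360):

        avfvideosrc ! video/x-raw,format=NV12,width=640,height=480 ! videocrop top=60 bottom=60 ! ...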

  17. We are trying to capture a frame on iOS from a GStreamer pipeline, but are not able to convert the buffer into a CIImage.

    Please find the code below for reference:

    static GstFlowReturn new_sample(GstAppSink *appsink, gpointer data1)
    {
        CMSampleBufferRef *buffer = Nil;

        g_signal_emit_by_name (appsink, "pull-buffer", &buffer);
        if (buffer) {
            /* The only thing we do in this example is print a * to indicate a received buffer */
            // g_print ("*");
            CVPixelBufferRef pixelBuffer = (CVPixelBufferRef) CMSampleBufferGetImageBuffer (*buffer);
            CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
            // NSLog(@"image…%@", image);
            gst_buffer_unref (buffer);
        }
        return GST_FLOW_OK;
    }

    Pipeline and callback setup:

    pipeline = gst_parse_launch ("udpsrc port=5000 caps=\"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264\" ! rtph264depay ! ffdec_h264 ! appsink name=sink sync=false async=false", &error);
    sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
    gst_app_sink_set_emit_signals ((GstAppSink *) sink, true);
    gst_app_sink_set_drop ((GstAppSink *) sink, true);
    gst_app_sink_set_max_buffers ((GstAppSink *) sink, 1);
    GstAppSinkCallbacks callbacks = { NULL, new_preroll, new_sample };
    gst_app_sink_set_callbacks (GST_APP_SINK (sink), &callbacks, (__bridge gpointer) (self), NULL);

    Please let me know how we can solve this problem. If there is any other approach, please let me know how to convert the buffer to a CIImage.

    1. Please ask on the GStreamer Developer’s mailing list here: http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
      You can also find code to convert from GStreamer buffers to CVPixelBuffers here http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/applemedia/vtenc.c#n1046

      Also, you're using GStreamer 0.10, which has not been supported for more than 3 years now. Please update to a newer version like 1.6, which also contains out-of-the-box support for the hardware codecs on iOS.
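
      In 1.x, the appsink callback pulls a GstSample instead of emitting "pull-buffer"; roughly like this (a sketch):

      static GstFlowReturn
      new_sample (GstAppSink * appsink, gpointer user_data)
      {
        GstSample *sample = gst_app_sink_pull_sample (appsink);
        if (sample) {
          GstBuffer *buffer = gst_sample_get_buffer (sample);
          GstMapInfo map;

          if (gst_buffer_map (buffer, &map, GST_MAP_READ)) {
            /* map.data / map.size contain the raw decoded frame;
             * convert to a CVPixelBuffer / CIImage from here */
            gst_buffer_unmap (buffer, &map);
          }
          gst_sample_unref (sample);
        }
        return GST_FLOW_OK;
      }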

    2. I am getting an error when trying to create the pipeline on iOS with your pipeline code: no element "udpsrc". Please suggest what I'm missing there.
