New GStreamer plugins: OpenNI2 (3D sensors, e.g. Kinect), RTP over TCP and OpenEXR (HDR image format)

Over the last few days a few new GStreamer plugins were added to latest GIT, and will be part of the 1.3/1.4 releases.

OpenNI2 – 3D sensors support

Miguel Casas-Sanchez worked on implementing a source element for the OpenNI2 API, which supports the Kinect camera of the Xbox and some other cameras that provide depth information as an additional channel. This is not to be confused with stereoscopic video as supported by some codecs and used in cinemas, which uses two frames captured from slightly different angles and creates a 3D impression from them.

This source element can capture from such cameras directly and can also read dumps of the camera input from files. Currently its output is either RGB (without depth information), 16-bit grayscale (only depth information) or RGBA (with depth information in the alpha channel); this can be selected with the “sourcetype” property right now. At a later time we should try to define a proper interface for handling depth information, especially one that is not at odds with the stereoscopic video API. Maybe there could be just another plane for the depth information, described by a GstMeta.
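To illustrate how the three output formats could relate for a single pixel, here is a minimal Python sketch. The function name and the simple truncation of the 16-bit depth value down to the 8-bit alpha channel are assumptions for illustration only; the element itself does this in C and may scale the depth differently:

```python
def pixel_outputs(r, g, b, depth16):
    """Sketch of the three possible outputs for one pixel (assumed layout).

    r, g, b: 8-bit color components; depth16: 16-bit depth sample.
    """
    rgb = bytes([r, g, b])                  # color only, no depth information
    gray16 = depth16.to_bytes(2, "big")     # depth only, 16-bit grayscale
    alpha = depth16 >> 8                    # depth squeezed into 8 bits (assumed scaling)
    rgba = bytes([r, g, b, alpha])          # color plus depth in the alpha channel
    return rgb, gray16, rgba
```

Note that a real 16-bit grayscale buffer would use whichever byte order the negotiated caps specify (e.g. GRAY16_LE or GRAY16_BE); big-endian is just assumed here.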

The plugin is available in gst-plugins-bad GIT.

RTP over TCP

One question that was asked often in the past is how to stream RTP over a TCP connection. RTP packets are expected to be sent over a datagram protocol like UDP, while TCP provides a stream protocol, so the receiver has to collect a complete packet before passing it onwards. RTP packets carry no size information, which means an additional framing protocol is required on top of RTP. Fortunately there’s an RFC that defines a very simple one which is used in practice: RFC 4571. Thanks to Olivier Crête for mentioning this RFC on the gstreamer-devel mailing list. The framing protocol is trivial: just prepend a 16-bit unsigned, big-endian integer containing the packet length in front of each RTP or RTCP packet.
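The framing can be sketched in a few lines of Python. The function names are made up for this example, but the length prefix is exactly the one RFC 4571 specifies:

```python
import struct

def rfc4571_frame(packet):
    """Prepend the 16-bit unsigned big-endian length defined by RFC 4571."""
    assert len(packet) <= 0xFFFF
    return struct.pack(">H", len(packet)) + packet

def rfc4571_unframe(stream):
    """Split a byte stream back into the framed RTP/RTCP packets."""
    packets = []
    offset = 0
    while offset + 2 <= len(stream):
        # read the 16-bit big-endian length prefix, then that many payload bytes
        (length,) = struct.unpack_from(">H", stream, offset)
        offset += 2
        packets.append(stream[offset:offset + length])
        offset += length
    return packets
```

A real receiver would of course read from a socket and buffer partial packets until the full length has arrived, which is precisely what makes the framing necessary on a stream protocol.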

Yesterday I wrote two simple elements implementing this RFC and integrated them into gst-plugins-good GIT.

They can for example be used like this:

gst-launch-1.0 audiotestsrc ! "audio/x-raw,rate=48000" ! vorbisenc ! rtpvorbispay config-interval=1 ! rtpstreampay ! tcpserversink port=5678

gst-launch-1.0 tcpclientsrc port=5678 host=127.0.0.1 do-timestamp=true ! "application/x-rtp-stream,media=audio,clock-rate=48000,encoding-name=VORBIS" ! rtpstreamdepay ! rtpvorbisdepay ! decodebin ! audioconvert ! audioresample ! autoaudiosink

A more elaborate solution could also use RTCP communication between the sender and receiver. RTCP can also be passed through the rtpstreampay and rtpstreamdepay elements the same way.

For anything more complicated you should consider looking into RTSP instead, as it is much more flexible and feature-rich, exchanges the stream configuration automatically, and also allows streams to be delivered via TCP (or UDP unicast/multicast). GStreamer has a source plugin and a server library for receiving and serving RTSP streams.

OpenEXR – HDR image formats

Another thing that I worked on (and still do) is an OpenEXR decoder plugin. OpenEXR is an HDR image format, but unfortunately we don’t have support for any HDR-compatible raw video format in GStreamer… yet! The OpenEXR decoder element is in gst-plugins-bad GIT now, but internally it converts the 16-bit floating point color components to 16-bit integers, clipping at 1.0. Once there’s consensus about how to expose such raw video formats in GStreamer (see Bugzilla #719902), support for the 16-bit floating point RGBA format should be easy to add.
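The clipping conversion described above can be sketched in Python like this. The element does this in C; the function name and the little-endian half-float decoding are assumptions made so the example is self-contained:

```python
import struct

def exr_half_to_uint16(half_bytes):
    """Convert one IEEE half-float color component to a 16-bit integer,
    clipping at 1.0 as the decoder element does (endianness assumed)."""
    (value,) = struct.unpack("<e", half_bytes)   # decode 16-bit float
    value = min(max(value, 0.0), 1.0)            # clip to [0.0, 1.0]
    return int(value * 65535 + 0.5)              # scale to full 16-bit range
```

Anything above 1.0, i.e. the actual high-dynamic-range part of the data, is lost in this conversion, which is exactly why a proper floating point raw video format would be preferable.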
