Mar 15 2014

By default, osssink will not work on the Raspberry Pi with gstreamer-1.0.

As root, enter the following:

modprobe snd-pcm-oss
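
To load the module automatically on every boot, one option (my addition, not part of the original instructions) is to append it to /etc/modules, again as root:

echo snd-pcm-oss >> /etc/modules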

And then you should be able to play your audio files, for instance using:

gst-launch-1.0 playbin uri=file:///home/pi/tina.wav

AIFF files can be played with:

gst-launch-1.0 playbin uri=file:///home/pi/tina.aiff

Alternatively this will give you a test tone:

gst-launch-1.0 audiotestsrc ! audioconvert ! audioresample ! autoaudiosink
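
audiotestsrc also accepts a freq property if you want a different pitch; this is a standard property of the element, not something shown in the original post:

gst-launch-1.0 audiotestsrc freq=880 ! audioconvert ! audioresample ! autoaudiosink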

You can use

alsamixer

to control the volume (using the arrow keys).
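
If you prefer a non-interactive command, amixer can set the volume directly. The control is usually called PCM on the Pi, but check the alsamixer display if that guess doesn't match your setup:

amixer set PCM 80%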

Feb 08 2014

Last month we posted a gstreamer pipeline for avi playback and simply presumed a working gstreamer installation. Now we have added a couple of packages to our growing repository that make installing gstreamer very easy.

The packages include the gst-omx module, which allows H.264 decoding and encoding with full use of the Raspberry Pi’s graphics hardware. But even with hardware support, the current stable series (1.2) doesn’t perform very well, because it does unnecessary copying of the video data. This is why these packages are based on the current development series (1.3), which means they are not as mature and well tested as the slightly older 1.2 versions. But so far we haven’t run into any problems of that nature.

As a side note: the gstreamer versioning can be a little confusing. The packages you find in software repositories are usually marked with “-0.10” or “-1.0”. The 1.x series is API- and ABI-stable and supersedes the previous stable 0.10 series; packages marked 1.0 can contain any version from the 1.x series.

The rest of the article covers installing the packages, how to check whether they work, and a couple of things you can try if they don’t.

Installation

Add our repository and install gst-omx-1.0.

The omx plugins need GST_OMX_CONFIG_DIR to be set. You can do that temporarily like this:

export GST_OMX_CONFIG_DIR=/usr/etc/xdg/

or add the same line to your .profile as a permanent solution.
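
For the permanent variant, appending the line to ~/.profile can be done in one step; note that it only takes effect for new login shells:

echo 'export GST_OMX_CONFIG_DIR=/usr/etc/xdg/' >> ~/.profile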

Test

You can use gst-inspect-1.0 to list all installed gstreamer plugins. Combine with grep and it’s easy to find out if gstreamer knows about the omx plugins:

gst-inspect-1.0 | grep omx

should give you something like this:

omx:  omxmpeg2videodec: OpenMAX MPEG2 Video Decoder
omx:  omxmpeg4videodec: OpenMAX MPEG4 Video Decoder
omx:  omxh263dec: OpenMAX H.263 Video Decoder
omx:  omxh264dec: OpenMAX H.264 Video Decoder
omx:  omxmjpegdec: OpenMAX MJPEG Video Decoder
omx:  omxvc1dec: OpenMAX WMV Video Decoder
omx:  omxh264enc: OpenMAX H.264 Video Encoder

To play a file you can use the previously mentioned pipeline:

gst-launch-1.0 filesrc location=test.avi ! avidemux ! h264parse ! omxh264dec ! autovideoconvert ! eglglessink
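
If your H.264 video sits in an MP4 container instead of AVI, the same idea should work with qtdemux in place of avidemux (an untested variation, not from the repository documentation):

gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! h264parse ! omxh264dec ! autovideoconvert ! eglglessink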

(Note that gstreamer is not a media player, but a framework that can be used by one. These commands are meant to support developers.)

Troubleshooting

Errors during playback

Make sure the memory split is generous towards the video hardware; otherwise you may get seemingly random and unrelated errors during playback.
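
The split can be adjusted with raspi-config, or by setting it directly in /boot/config.txt and rebooting. 128 MB is a reasonable starting point for video work; the exact value you need depends on your content:

gpu_mem=128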

The omx plugins don’t show up

If gst-inspect-1.0 doesn’t show the plugins, try deleting gstreamer’s registry (the file lives under your home directory):

rm ~/.cache/gstreamer-1.0/registry.armv6l.bin

Pipeline debugging

A very useful feature for debugging pipelines is the possibility to create a visual graph of the pipeline. Set GST_DEBUG_DUMP_DOT_DIR to a temporary directory before running the pipeline:

mkdir tmp
export GST_DEBUG_DUMP_DOT_DIR=./tmp/

Gstreamer will dump a lot of debugging information in *.dot files in that directory. You can use dot from the graphviz package to create image files from those:

dot -Tpng x.xx.xx.xxxxxxxxx-gst-launch.FOO.dot > graph.png
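
To convert every dump in one go, a small shell loop does the job (a convenience addition of mine):

for f in ./tmp/*.dot; do dot -Tpng "$f" > "${f%.dot}.png"; done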

Links

  1. gstreamer pipeline for avi playback
  2. list of packages in our repository
  3. gstreamer project website
  4. streaming H.264 via RTP
  5. slightly outdated but useful StackExchange
  6. Raspberry Pi forums
Aug 02 2013

This is still a work in progress with unsatisfactory results (image quality, delay, very low frame rate), but here it is for the brave-hearted and those researching in the same direction:

Set up Windows streaming host

This can be a multi-monitor machine. Your left-most monitor will be streamed.

I generally use FullHD resolution for testing.

  • Install a Direct Show Screen Capture Filter for Windows. We used the direct show filter provided with “Screen Capturer Recorder” by Roger D Pack. Roger also includes an audio direct show capturer. And all free of charge – a real bargain 😉
  • Maybe a reboot is necessary here
  • Install the latest version of ffmpeg from Zeranoe. Opt for the static builds (probably the 64-bit one if you are running a modern 64-bit Windows on a modern computer)
  • Extract the download to a safe location
  • Open PowerShell, and navigate to the location

List the available screen filter devices:

This and all following shell commands are to be issued in the PowerShell. 

.\ffmpeg -list_devices true -f dshow -i dummy

This will show you the available input devices to capture from. My list looks like this, for instance:

 DirectShow video devices
  "Integrated Webcam"
  "screen-capture-recorder"
 DirectShow audio devices
  "Microphone (2- High Definition Audio Device)"
  "virtual-audio-capturer"

Start the stream:

.\ffmpeg -f dshow -i video="screen-capture-recorder" -vcodec libx264 -vprofile baseline -preset ultrafast -tune zerolatency  -pix_fmt yuv420p -b:v 400k -r 30  -threads 4  -fflags nobuffer -f rtp rtp://192.168.1.14:1234

Since I started this from PowerShell, the .\ prefix is needed to run an application from the current folder.

  • libx264 is used as video codec, rather than mpeg4 (for superior quality – the Raspi is capable of H264 hardware decoding)
  • baseline profile needs to be used together with -pix_fmt yuv420p; this basically reduces the encoding to a simple subset of the full standard. Leaving out these two options led to the streaming not working, but you may be able to figure something out. Please comment!
  • -preset ultrafast and -tune zerolatency both accelerate the video output. I have a latency of about 1 to 2 seconds in our lab here
  • -b:v 400k sets the target bitrate (as a variable bitrate)
  • -r 30 this sets the framerate to 30
  • -threads 4 – give more threads to ffmpeg
  • -fflags nobuffer – should decrease latency even further. Not sure if it does, though.
  • -f rtp – specifies the output format. Here we use rtp, and stream it directly to the raspberry – which has the IP 192.168.1.14 on our network. You can choose whatever you like for the port, by an odd coincidence we chose 1234. Aliens?!?

Hit “Enter” and ffmpeg will start streaming. It will show you handy statistics – current frame number, framerate, quality, total size, total time, current bitrate, duplicated capture-frames, dropped capture-frames (i.e. the capturing rate does not align with the streaming rate). Do not worry too much about those for now.

Please note that you need some horsepower for capturing, encoding and streaming in real-time.

Set up Raspberry Pi

omxplayer can’t handle RTP streams directly – thus, we resort to GStreamer.

GStreamer 1.0 includes special support for the Raspberry Pi’s Broadcom SoC’s VideoCore IV hardware video functions (also known as OpenMax). Unfortunately, the Raspbian maintainers do not want to include it (yet), in order not to diverge too far from the official Debian repositories.

Luckily for you, though, someone has precompiled the binaries and set up a repository. See this thread for more background information, or simply follow my instructions:

sudo nano /etc/apt/sources.list

This will open nano to edit your package repository list. Please add the following line into this file:

deb http://vontaene.de/raspbian-updates/ . main

After saving the file and exiting (Ctrl+O, Enter, then Ctrl+X), run the following commands:

sudo aptitude update
sudo aptitude install libgstreamer1.0-0-dbg gstreamer1.0-tools libgstreamer-plugins-base1.0-0 gstreamer1.0-plugins-good gstreamer1.0-plugins-bad-dbg gstreamer1.0-omx gstreamer1.0-alsa

This will install gstreamer 1.0 and the necessary components.
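
As a quick check that the OMX plugins were picked up, you can list the registered plugins and filter for omx:

gst-inspect-1.0 | grep omx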

Start the stream receiver & decoder chain:

gst-launch-1.0 -v udpsrc port=1234 caps='application/x-rtp,payload=(int)96,encoding-name=(string)H264' ! queue ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=True

This can be done as user pi. Please note that this may not be the perfect command to achieve playback, but it is a good starting point, as it works!

Gstreamer sets up “pipelines”, in which data is passed on from step to step, being transformed along the way. While it may seem like a lot at first glance, it is quite logical once you have figured it out.

  • we specify a UDP source (udpsrc), the port, and “caps”
  • Without the RTP caps, playback is not possible. Apparently they are not provided along with the stream? Thus, we have to specify the caps manually.
  • In the caps we specify some information for the pipeline
  • queue adds a small buffer and a thread boundary between elements; it can likely be omitted here
  • rtph264depay – depayload h264 data from rtp stream
  • h264parse – parse h264 data
  • omxh264dec – decode the data with Broadcom OpenMAX hardware acceleration
  • autovideosink – put the result on the display
  • sync=True – I am not sure whether this does anything, or whether it is in the right place and form. It was an attempt to fix the gst_base_sink_is_too_late problems (but it did NOT fix them).

Issues

slow screen updates

These are very likely caused by a slow screen-capture refresh rate; a different screen capturer may do better.

On Windows 8, with a pretty powerful Core i7 machine, I get a possible 15.41 fps (while 30 fps was negotiated). This is using Roger’s / betterlogic’s screen-capture-recorder. Roger attributes this to Aero.

See more about it here and here (the latter also provides a list of other available DirectShow screen capture filters).

artifacts

Gstreamer shows massive H.264 artifacts. Matthias Bock has opened an issue for this, which also contains some further hints.

This seems to be related to the bitrate set in ffmpeg: if I lower it to around 400k, the artifacts become less severe and the image quality is quite OK. Also, use a variable bitrate instead of a constant one.

gst_base_sink_is_too_late()

This may be related to the Pi’s fake hardware clock (?). It also appears when running gstreamer with a simple test image setup:

gst-launch-1.0 videotestsrc ! autovideosink

gstbasesink.c(2683): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles:
There may be a timestamping problem, or this computer is too slow.

 

The command above will display a test video image.
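
A workaround that is often suggested for this warning (it masks the symptom rather than fixing the timestamping, and I have not verified it here) is to turn off clock synchronisation at the sink, e.g. in the receiver pipeline:

gst-launch-1.0 -v udpsrc port=1234 caps='application/x-rtp,payload=(int)96,encoding-name=(string)H264' ! queue ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=false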

Sound

I have not tried sound yet. Sound should be fed into ffmpeg using the following arguments:

-i audio="virtual-audio-capturer":video="screen-capture-recorder"

This is taken directly from Roger’s GitHub documentation.

Ideas

  • Try using gstreamer on Windows for streaming?
  • Adjust parameters for betterlogic/Roger’s DirectShow capturer
    • apparently it hits the ceiling at 15 fps with Aero on
  • Use a different DirectShow capturer
  • Tune quality for ffmpeg stream

Background info

  • H.264 is MPEG-4 Part 10, also known as MPEG-4 AVC (“Advanced Video Coding”), and is the more modern and data-efficient codec format;
  • whereas MPEG-4 Part 2 (MPEG-4 Visual) is based on the older compression techniques used in MPEG-2 and is also implemented in DivX, Xvid, etc.
  • you can also use .\ffplay -i udp://:1234 to test the streaming output on the local machine. The video quality IS NOT TO BE USED AS A REFERENCE. It just shows that it “works”. Change the target IP accordingly (“localhost” instead of the Raspi’s IP will do, I believe.)
