Using the Aravis GStreamer Source Plug-In

Using the Aravis GStreamer source, we can stream images from GenICam cameras and combine them with the many powerful pre-built GStreamer elements to rapidly prototype and construct high-performance imaging pipelines for a wide variety of applications.

Installing GStreamer on Linux

In a terminal, enter the following command:

$ sudo apt-get install libgstreamer1.0-0 gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio

Verify that the installation was successful:

$ gst-launch-1.0 videotestsrc ! videoconvert ! xvimagesink

You should now see a test pattern.

If GStreamer did not install properly, consult:

Installing Aravis 0.8 on Linux

Next, download and install Aravis, an open-source GenICam image acquisition library.

The project's GitHub repository:

Download and extract the latest stable release from:

Install the meson build system from:

One option is to use pip:

$ python3 -m pip install meson

Install the Ninja build system:

$ sudo apt-get install ninja-build

Install some dependencies:

$ sudo apt-get install libgstreamer-plugins-base1.0-dev libnotify-dev intltool build-essential libgtk-3-dev gtk-doc-tools  libusb-1.0-0-dev libxml2-dev 

Navigate to the Aravis package download location, e.g.:

$ cd aravis-0.8.2

Follow the build instructions from the Aravis README file:

$ meson build
$ cd build
$ ninja
$ sudo ninja install

Then refresh shared library cache:

$ sudo ldconfig

If the build fails, installing missing dependencies may be required:

Note the location where the GStreamer plugin is installed. This is part of the output of the "sudo ninja install" command. For example, on arm64 machines the line of interest looks something like:
Installing gst/ to /usr/local/lib/aarch64-linux-gnu/gstreamer-1.0

Add this path to the GST_PLUGIN_PATH environment variable as follows, so that GStreamer includes it among the paths it loads plugins from:

$ export GST_PLUGIN_PATH=$GST_PLUGIN_PATH:/usr/local/lib/aarch64-linux-gnu/gstreamer-1.0

Note that this path differs between architectures.

This variable does not persist across terminal sessions, so it is a good idea to set it in a script that runs automatically, such as ~/.profile or ~/.bashrc.
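For example, the export can be appended to ~/.bashrc so that every new interactive shell picks it up. The arm64 path below is the example from above; substitute the path reported by your own install:

```shell
# Persist the plugin path for future shells. The single quotes keep
# $GST_PLUGIN_PATH from being expanded at write time, so it is resolved
# each time the shell starts.
echo 'export GST_PLUGIN_PATH=$GST_PLUGIN_PATH:/usr/local/lib/aarch64-linux-gnu/gstreamer-1.0' >> ~/.bashrc
```

You will still need to export the variable in the current terminal (or open a new one) for it to take effect immediately.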

Verify that GStreamer can see the plugin with:

$ gst-inspect-1.0 aravissrc

If you are using a USB3 camera:

  1. Add the Aravis rules to your udev rules:
  2. Download aravis.rules from
  3. Move this file to: /etc/udev/rules.d/
  4. Restart your computer to ensure the rules are reloaded
  5. Once restarted, re-export the GST_PLUGIN_PATH variable
  6. Try to stream images from your camera using the Aravis viewer:

$ arv-viewer-0.8

If the camera is visible in arv-viewer and the aravissrc element is found by the "gst-inspect-1.0 aravissrc" command, then try a simple GStreamer pipeline:

$ gst-launch-1.0 aravissrc ! videoconvert ! xvimagesink -v

This should open a window and display a live stream from the camera.

To end the stream, enter Ctrl-C in the terminal to send an EOS (End of Stream) signal to the pipeline.

If you are having any issues, please consult:

Example GStreamer pipelines using the Aravis Source 

You can specify settings such as width, height, and frame rate using a caps filter:

$ gst-launch-1.0 aravissrc ! video/x-raw, width=960, height=720, framerate=10/1 ! videoconvert ! xvimagesink

You can specify the image pixel format (RGB is a color format):

$ gst-launch-1.0 aravissrc ! video/x-raw, width=960, height=720, framerate=10/1, format=RGB ! videoconvert ! xvimagesink

You can add filters to your pipeline to perform operations on the image stream. For example, to flip the images:

$ gst-launch-1.0 aravissrc ! video/x-raw, width=960, height=720, framerate=10/1 ! videoconvert ! videoflip video-direction=180 ! videoconvert ! xvimagesink

Example GStreamer pipelines using the Aravis Source and Nvidia hardware acceleration

You can take advantage of Nvidia hardware acceleration by constructing pipelines according to:

Here is an example using Nvidia hardware and a color FLIR camera that streams video, encodes it with the Nvidia-accelerated h264 encoder, muxes it into an mp4 container, and saves it to a file:

$ gst-launch-1.0 aravissrc ! video/x-raw, format=RGB, width=960, height=720, framerate=10/1 ! videoconvert ! omxh264enc ! video/x-h264, stream-format=byte-stream ! h264parse ! qtmux ! filesink location=testvid.mp4 -e

For a software-encoder alternative to the omxh264enc hardware encoder, one option is x264enc.
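As a sketch, the recording pipeline above could be adapted to x264enc as follows (the caps and filename mirror the hardware example; tune=zerolatency is an optional x264enc property that reduces encoder latency for live sources):

```shell
$ gst-launch-1.0 aravissrc ! video/x-raw, format=RGB, width=960, height=720, framerate=10/1 ! videoconvert ! x264enc tune=zerolatency ! h264parse ! qtmux ! filesink location=testvid.mp4 -e
```

Because x264enc runs on the CPU, expect a higher processor load than with the hardware encoder, especially at larger resolutions and frame rates.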

Here is how you can play the video you just recorded using GStreamer's playbin element:

$ gst-launch-1.0 playbin uri=file://<path to video>/testvid.mp4

Live stream video over a network using RTP

Using a GStreamer pipeline we can stream video over a network (wired or wireless).

Server Side Example

This is the host machine controlling the camera.

$ gst-launch-1.0 aravissrc ! video/x-raw, width=960, height=720, framerate=10/1 ! videoconvert ! x264enc ! rtph264pay pt=96 name=pay0 config-interval=1 ! udpsink host=<ip address of the client> port=5555 -e

Client Side Example

This is the machine where we want to display the live stream.

$ gst-launch-1.0 udpsrc port=5555 caps='application/x-rtp' ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink sync=false

For more information about using GStreamer please visit: