
How To Create Video Thumbnails With Python And Gstreamer

I'd like to create thumbnails for MPEG-4 AVC videos using GStreamer and Python. Essentially: open the video file, seek to a certain point in time (e.g. 5 seconds), grab the frame at …

Solution 1:

To elaborate on ensonic's answer, here's an example:

import os
import sys

import gst

def get_frame(path, offset=5, caps=gst.Caps('image/png')):
    pipeline = gst.parse_launch('playbin2')
    pipeline.props.uri = 'file://' + os.path.abspath(path)
    pipeline.props.audio_sink = gst.element_factory_make('fakesink')
    pipeline.props.video_sink = gst.element_factory_make('fakesink')
    pipeline.set_state(gst.STATE_PAUSED)
    # Wait for state change to finish.
    pipeline.get_state()
    assert pipeline.seek_simple(
        gst.FORMAT_TIME, gst.SEEK_FLAG_FLUSH, offset * gst.SECOND)
    # Wait for seek to finish.
    pipeline.get_state()
    buffer = pipeline.emit('convert-frame', caps)
    pipeline.set_state(gst.STATE_NULL)
    return buffer

def main():
    buf = get_frame(sys.argv[1])

    with open('frame.png', 'wb') as fh:
        fh.write(str(buf))

if __name__ == '__main__':
    main()

This generates a PNG image. You can get raw image data using gst.Caps("video/x-raw-rgb,bpp=24,depth=24") or something like that.
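For instance, a small sketch (untested; it reuses the get_frame function above, still with the 0.10 bindings) that asks for raw RGB instead of PNG and reads the frame size back from the returned buffer's caps:

buf = get_frame(sys.argv[1], caps=gst.Caps('video/x-raw-rgb,bpp=24,depth=24'))
# the negotiated caps on the buffer carry the frame geometry
structure = buf.get_caps()[0]
width, height = structure['width'], structure['height']
print '%dx%d RGB frame, %d bytes' % (width, height, len(str(buf)))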

Note that in GStreamer 1.0 (as opposed to 0.10), playbin2 has been renamed to playbin and the convert-frame signal is named convert-sample.
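In other words, with the 1.0 Python bindings the two renamed pieces look roughly like this (a sketch only; it assumes from gi.repository import Gst and a prior Gst.init(None)):

playbin = Gst.ElementFactory.make('playbin', None)   # 0.10: gst.parse_launch('playbin2')
# 0.10: 'convert-frame'; in 1.0 the action signal returns a Gst.Sample
sample = playbin.emit('convert-sample', Gst.Caps.from_string('image/png'))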

The mechanics of seeking are explained in the seeking chapter of the GStreamer Application Development Manual. The 0.10 playbin2 documentation no longer seems to be online, but playbin is documented in the current GStreamer 1.0 reference.

Solution 2:

An example in Vala, with GStreamer 1.0:

var playbin = Gst.ElementFactory.make ("playbin", null);
playbin.set ("uri", "file:///path/to/file");
// some code here.
var caps = Gst.Caps.from_string("image/png");
Gst.Sample sample;
Signal.emit_by_name(playbin, "convert-sample", caps, out sample);
if(sample == null)
    return;
var sample_caps = sample.get_caps ();
if(sample_caps == null)
    return;
unowned Gst.Structure structure = sample_caps.get_structure(0);
int width = (int)structure.get_value ("width");
int height = (int)structure.get_value ("height");
var memory = sample.get_buffer().get_memory (0);
Gst.MapInfo info;
memory.map (out info, Gst.MapFlags.READ);
uint8[] data = info.data;

Solution 3:

It's an old question, but I still haven't found this documented anywhere. I found that the following worked on a playing video with GStreamer 1.0:

import gi
import time
gi.require_version('Gst', '1.0')
from gi.repository import Gst

def get_frame():
    caps = Gst.Caps('image/png')
    pipeline = Gst.ElementFactory.make("playbin", "playbin")
    pipeline.set_property('uri','file:///home/rolf/GWPE.mp4')
    pipeline.set_state(Gst.State.PLAYING)
    #Allow time for it to start
    time.sleep(0.5)
    # jump 30 seconds
    seek_time = 30 * Gst.SECOND
    pipeline.seek(1.0, Gst.Format.TIME,(Gst.SeekFlags.FLUSH | Gst.SeekFlags.ACCURATE),Gst.SeekType.SET, seek_time , Gst.SeekType.NONE, -1)

    #Allow video to run to prove it's working, then take snapshot
    time.sleep(1)
    buffer = pipeline.emit('convert-sample', caps)
    buff = buffer.get_buffer()
    result, map = buff.map(Gst.MapFlags.READ)
    if result:
        data = map.data
        pipeline.set_state(Gst.State.NULL)
        return data
    else:
        return

if __name__ == '__main__':
    Gst.init(None)
    image = get_frame()
    with open('frame.png', 'wb') as snapshot:
        snapshot.write(image)

The code should run with both Python 2 and Python 3; I hope it helps someone.
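If the fixed sleeps feel fragile, one alternative (a sketch only, untested; the uri and offset below are placeholders) is to preroll in PAUSED and let get_state() block until the state change and the flushing seek have completed, the same way Solution 1 does with 0.10:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

def get_frame(uri, offset=30):
    playbin = Gst.ElementFactory.make('playbin', None)
    playbin.set_property('uri', uri)
    playbin.set_state(Gst.State.PAUSED)
    # get_state() blocks until the asynchronous state change (preroll) finishes
    playbin.get_state(Gst.CLOCK_TIME_NONE)
    playbin.seek_simple(Gst.Format.TIME,
                        Gst.SeekFlags.FLUSH | Gst.SeekFlags.ACCURATE,
                        offset * Gst.SECOND)
    # wait again so the flushing seek has taken effect before grabbing the frame
    playbin.get_state(Gst.CLOCK_TIME_NONE)
    sample = playbin.emit('convert-sample', Gst.Caps.from_string('image/png'))
    playbin.set_state(Gst.State.NULL)
    if sample is None:
        return None
    buf = sample.get_buffer()
    # extract_dup() copies the buffer contents into a Python bytes object
    return buf.extract_dup(0, buf.get_size())

if __name__ == '__main__':
    Gst.init(None)
    data = get_frame('file:///path/to/video.mp4', offset=30)
    if data:
        with open('frame.png', 'wb') as fh:
            fh.write(data)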

Solution 4:

Use playbin2. Set the uri to the media file, use gst_element_seek_simple to seek to the desired time position, and then use g_signal_emit to invoke the "convert-frame" action signal.
