[meta-freescale] Gstreamer pipeline problem

Thomas Senyk thomas.senyk at pelagicore.com
Mon Jul 15 10:11:21 PDT 2013


On Monday, 15 July, 2013 17:41:17 Chris Tapp wrote:
> On 15 Jul 2013, at 09:24, Thomas Senyk wrote:
> > On Friday, 12 July, 2013 17:35:01 Chris Tapp wrote:
> >> Hi Thomas,
> >> 
> >> Some more info below:
> >> 
> >> On 11 Jul 2013, at 09:39, Thomas Senyk wrote:
> >>> On Wednesday, 10 July, 2013 20:53:58 Chris Tapp wrote:
> >>>> On 10 Jul 2013, at 20:19, Chris Tapp wrote:
> >>>>> I've got an application which uses playbin2 to capture video. The
> >>>>> pipeline
> >>>>> is of the form:
> >>>>> 
> >>>>> playbin2 uri=... video-sink="queue2 ! videoscale ! video/x-raw-rgb,
> >>>>> pixel-aspect-ratio=1/1, width=<capture-width>, height=<capture-height>
> >>>>> !
> >>>>> fakesink"
> >>>>> 
> >>>>> I then get the "frame" property from the pipeline and use this to grab
> >>>>> the
> >>>>> latest frame.
> >>>>> 
> >>>>> This works on my development system (Ubuntu 11.10) and a Cedar Trail /
> >>>>> Yocto system, but the pipeline fails on the Wandboard Quad. I think
> >>>>> this
> >>>>> is related to:
> >>>>> 
> >>>>> 0:00:13.028151336  1349 0x4442d520 WARN           basetransform
> >>>>> /media/SSD-RAID/build-danny-wandboard/tmp/work/armv7a-vfp-neon-poky-linux-gnueabi/gstreamer/0.10.36-r2/gstreamer-0.10.36/libs/gst/base/gstbasetransform.c:1304:gst_base_transform_setcaps:<videoscale0x2ab820>
> >>>>> transform could not transform video/x-raw-yuv, width=(int)854,
> >>>>> height=(int)480, framerate=(fraction)24/1, format=(fourcc)I420,
> >>>>> interlaced=(boolean)false in anything we support
> >>>>> 
> >>>>> I added an ffmpegcolorspace element between the queue2 and the
> >>>>> videoscale to get round this, and the pipeline now builds, but only a
> >>>>> few frames are captured. There are different diagnostics showing:
> >>>>> 
> >>>>> 0:00:02.881403000  1361   0x28da60 WARN                  vpudec
> >>>>> vpudec.c:914:gst_vpudec_core_create_and_register_frames: Allocate
> >>>>> Internal framebuffers!!!!
> >>>>> Message Callback : Element playbin0x250b68 changed state from READY to PAUSED.
> >>>>> 0:00:03.237675000  1361   0x28da60 WARN                  vpudec
> >>>>> vpudec.c:1578:gst_vpudec_chain: Got no frame buffer message, return
> >>>>> 0x89, 8 frames in displaying queue!!
> >>>>> 0:00:03.242324334  1361   0x28da60 WARN                  vpudec
> >>>>> vpudec.c:1578:gst_vpudec_chain: Got no frame buffer message, return
> >>>>> 0x89, 8 frames in displaying queue!!
> >>>>> 
> >>>>> <lots of repeats>
> >>>>> 
> >>>>> 0:00:08.499914334  1382   0x28d860 WARN                  vpudec
> >>>>> vpudec.c:1655:gst_vpudec_chain: Retry too many times, maybe BUG!!
> >>>>> 0:00:08.500784667  1382   0x28d860 WARN                  vpudec
> >>>>> vpudec.c:1578:gst_vpudec_chain: Got no frame buffer message, return
> >>>>> 0x88, 8 frames in displaying queue!!
> >>>>> 
> >>>>> <lots of repeats>
> >>>>> 
> >>>>> Message Callback : Element playbin0x250aa0 changed state from PAUSED to PLAYING.
> >>>>> 0:00:09.253202667  1382   0x28d860 WARN                  vpudec
> >>>>> vpudec.c:1578:gst_vpudec_chain: Got no frame buffer message, return
> >>>>> 0x88, 8 frames in displaying queue!!
> >>>>> 
> >>>>> 0:00:13.364523335  1460   0x142ec0 WARN             mfw_v4lsink
> >>>>> mfw_gst_v4l_buffer.c:435:mfw_gst_v4l2_new_buffer: Try new buffer
> >>>>> failed,
> >>>>> ret 2 No such file or directory queued 0
> >>>>> 
> >>>>> 
> >>>>> The "Message Callback" events are my own logging to try and see what's
> >>>>> happening in my app.
> >>>>> 
> >>>>> Is this something I'm doing wrong, or are these messages a real issue
> >>>>> somewhere?
> >>>> 
> >>>> This is when playing a .webm. The results for an .flv are as expected.
> >>> 
> >>> Is it the same when you use a v4l2 sink instead of fakesink?
> >>> Does playbin (without defining the pipeline) behave the same?
> >> 
> >> I've run some tests using pipelines of the form:
> >>   gst-launch playbin2 uri=file://trailer.webm video-sink="<a-sink-to-test>"
> >> 
> >> GST_DEBUG was set to "*:2"
> >> 
> >> 1) When the sink is set to "mfw_v4lsink" I get full-screen video playback;
> >> 2) When the sink is set to "fakesink" I get nothing on the screen (as
> >> expected), and no debug of interest;
> >> 3) When the sink is set to "queue2 ! mfw_v4lsink" I get very choppy
> >> full-screen video playback (which eventually stalls) and debug output as
> >> originally reported;
> >> 4) When the sink is set to "queue2 ! fakesink" I get nothing on the screen
> >> (as expected), and no debug of interest;
> >> 5) When the sink is set to "queue2 ! fakesink sync=1" I get nothing on the
> >> screen and debug output as originally reported.
> >> 
> >> These were all run from the command line, no 'X' running.
> >> 
> >> It looks like the queue2 + sync=1 (the default for v4lsink) combination
> >> causes the problem... Setting sync=0 for v4lsink makes no difference and
> >> appears to be ignored.
> > 
> > Do you actually need queue2 or sync=1?
> > 
> > Your end goal is to get everything into your OpenGL context and render it,
> > right?
> > Shouldn't you be fine setting up a minimal GStreamer pipeline and letting
> > GStreamer handle the rest?
> 
> I normally run a basic playbin2 pipeline to 'fakesink'. When it's time to
> render I grab the latest frame via playbin2, upload this into a texture and
> render.

Ah ok, then I see two possible setups.

Either:
<decoder pipeline> ! glupload ! fakesink ! <your application uses the texture
ids from glupload that come in with the GstBuffer>

Or:
<decoder pipeline> ! fakesink ! <your application maps the memory that comes
in with the GstBuffer, using glTexDirectVIVMap>


... using fakesink 'in front' of your application is a means to get access to
the GstBuffer objects, right? (Rather than having to implement the pads and
everything for an actual sink, you just have to implement one callback
function for the 'handoff' signal.)
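
Roughly, the 'handoff' part could look like this (an untested sketch against
the GStreamer 0.10 API; the element and function names like setup_video_sink
are just placeholders):

#include <gst/gst.h>

/* Called for every buffer that reaches the fakesink. */
static void
on_handoff (GstElement *sink, GstBuffer *buffer, GstPad *pad, gpointer user_data)
{
    /* 'buffer' holds the decoded frame; just take a ref and hand it over to
     * the render thread - do as little work as possible in this callback. */
    GstBuffer *frame = gst_buffer_ref (buffer);
    /* ... store 'frame' for the next render pass, unref the previous one ... */
}

static void
setup_video_sink (GstElement *playbin)
{
    GstElement *sink = gst_element_factory_make ("fakesink", "video-fakesink");

    /* fakesink only emits 'handoff' when signal-handoffs is enabled;
     * sync=TRUE paces the frame delivery against the pipeline clock. */
    g_object_set (sink, "signal-handoffs", TRUE, "sync", TRUE, NULL);
    g_signal_connect (sink, "handoff", G_CALLBACK (on_handoff), NULL);

    g_object_set (playbin, "video-sink", sink, NULL);
}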


Do I get you right?

If so, I'd advise you to go with the 'or' option: it's not much code and you
have all the OpenGL parts in your own hands.
There shouldn't be any trade-off to this... besides having to write
hardware-specific code.
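
For the 'or' variant, the hardware-specific bit is roughly this (again just an
untested sketch; it assumes the Vivante GL_VIV_direct_texture extension from
the i.MX6 BSP, and depending on your headers you may have to resolve
glTexDirectVIVMap via eglGetProcAddress instead of GL_GLEXT_PROTOTYPES; the
function name upload_yuv_frame is a placeholder):

#define GL_GLEXT_PROTOTYPES
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <gst/gst.h>

static void
upload_yuv_frame (GLuint tex, int width, int height, GstBuffer *frame)
{
    /* Virtual address of the decoded YUV data; ~0 as the physical address
     * tells the driver to look it up from the virtual one. */
    void  *logical  = (void *) GST_BUFFER_DATA (frame);
    GLuint physical = ~0U;

    glBindTexture (GL_TEXTURE_2D, tex);

    /* Map the decoder's buffer directly as a YV12 texture - no copy and no
     * software YUV->RGB conversion, the texture unit samples it directly. */
    glTexDirectVIVMap (GL_TEXTURE_2D, width, height, GL_VIV_YV12,
                       &logical, &physical);
    glTexDirectInvalidateVIV (GL_TEXTURE_2D);
}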



> 
> I seem to need queue2 on the video-sink and audio-sink to keep them in sync.
> Video sinks generally set sync=1 by default, but fakesink doesn't. If this
> isn't set, then the video plays too fast and loses sync with the audio.

Ok, I've never looked at lip sync so far (probably should have...) so I can't
tell, and therefore never tried to fix it either.

> > Don't get me wrong. I'm not a GStreamer expert :)
> > I'm just trying to give any (possibly) useful hint I have :)
> 
> Me neither. Any hints are more than welcome ;-)
> 
> >> Things also 'get confused' at times and 1) doesn't work anymore without a
> >> restart.
> > 
> > I've seen this as well... the framebuffer goes black and nothing seems to
> > be able to render to it anymore... I think for me it's something with
> > VPU vs. GPU memory (which is the same 'allocation' in the unified memory)
> > 
> > See:
> > https://community.freescale.com/docs/DOC-93591
> > 
> > Do you see any allocation errors?
> > I'm going to play with 'gpumem' today to see if I can get better/stable
> > results.
> > 
> >> Does any of that help? ;-)
> > 
> > Another thing I spotted:
> > I just looked at your original mail, and from the work I've done last week
> > I can tell that I've never seen* or therefore used x-raw-rgb.
> > * 'seen' as in: a source file having an RGB format
> > 
> > For me there was no need ... I have a sink (Qt/C++ code) which takes
> > x-raw-yuv ... this code hands the frame over to the OpenGL scene graph and
> > then I map this memory with code like:
> > 
> > void *bits = (void*)mFrame.bits();
> > GLuint physical = ~0U;
> > glTexDirectVIVMap_LOCALE(GL_TEXTURE_2D, w, h, GL_VIV_YV12, &bits,
> > &physical);
> > 
> > 
> > This is then used directly in the shaders without any YUV->RGB code. The
> > texture unit knows YUV and does the texture-coordinate/color lookup
> > accordingly.
> > ... what I'm trying to say: you don't have to convert to GL_RGB(A).
> > 
> >> Chris Tapp
> >> 
> >> opensource at keylevel.com
> >> www.keylevel.com
> 
> Chris Tapp
> 
> opensource at keylevel.com
> www.keylevel.com


