[meta-freescale] Gstreamer pipeline problem

Chris Tapp opensource at keylevel.com
Mon Jul 15 13:58:56 PDT 2013


On 15 Jul 2013, at 18:11, Thomas Senyk wrote:

<SNIP>

>> 
>> I normally run a basic playbin2 pipeline to 'fakesink'. When it's time to
>> render I grab the latest frame via playbin2, upload this into a texture and
>> render.
> 
> ah ok, then I see two possible setups.
> 
> either: 
> <decoder pipeline> ! glupload ! fakesink ! <your application using the tex ids 
> from glupload coming in with GstBuffer>
> 
> or: 
> <decoder pipeline> ! fakesink ! <your application maps the memory, coming in 
> with GstBuffer, using glTexDirectVIVMap>
> 
> 
> ... using fakesink 'in front' of your application is a means to get access
> to the GstBuffer objects, right? (Rather than having to implement the pads
> and everything for an actual sink, you just have to implement one callback
> function for the 'handoff' signal.)
> 
> Do I understand you correctly?

Nearly. What I've got is basically just:

playbin2 uri="..." video-sink="fakesink"

So, not a lot going on! I then use the "frame" property of playbin2 to get the most recently rendered frame, which I then upload with glTexImage2D.
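
In code that grab-and-upload boils down to something like this sketch (GStreamer 0.10 API; I've hard-coded RGB and the dimensions for brevity - real code should read them from GST_BUFFER_CAPS(buf)):

GstBuffer *buf = NULL;

/* playbin2's "frame" property returns a reference to the last frame */
g_object_get (G_OBJECT (playbin), "frame", &buf, NULL);

if (buf) {
    glBindTexture (GL_TEXTURE_2D, tex_id);
    glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                  GL_RGB, GL_UNSIGNED_BYTE, GST_BUFFER_DATA (buf));
    gst_buffer_unref (buf);   /* drop the reference the property gave us */
}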

I really do it this way so I can easily control where the frame is rendered and get it in sync with my rendering loop (which has lots of other animation taking place).

> 
> If so, I'd advise you to go with the 'or' option; it's not much code and
> you keep all the OpenGL parts in your own hands.
> There shouldn't be any trade-off here ... besides having to write
> hardware-specific code.

That's the real catch - I'm trying to write code which I can use on multiple platforms.
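
The GStreamer half of the 'or' is portable enough, mind - in 0.10 the fakesink hook is roughly this sketch (names are hypothetical, not code from my app):

/* fakesink calls this for every buffer once signal-handoffs is enabled */
static void
on_handoff (GstElement *sink, GstBuffer *buf, GstPad *pad, gpointer user_data)
{
    /* buf holds the decoded frame; map or copy it for the render thread */
}

/* during pipeline setup: */
g_object_set (G_OBJECT (vsink), "signal-handoffs", TRUE, NULL);
g_signal_connect (vsink, "handoff", G_CALLBACK (on_handoff), NULL);

It's the glTexDirectVIVMap end that ties it to the Vivante GPU.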

>> 
>> I seem to need queue2 on the video-sink and audio-sink to keep them in sync.
>> Video sinks generally set sync=1 by default, but fakesink doesn't. If this
>> isn't set, then the video plays too fast and loses sync with the audio.
> 
> ok, I've never looked at lip-sync so far (probably should have...) so I
> can't tell, and therefore never tried to fix it either.
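
For what it's worth, forcing sync on the fakesink is only a couple of lines, e.g.:

GstElement *vsink = gst_element_factory_make ("fakesink", NULL);

/* pace buffers against the pipeline clock, as a real video sink would */
g_object_set (G_OBJECT (vsink), "sync", TRUE, NULL);
g_object_set (G_OBJECT (playbin), "video-sink", vsink, NULL);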

I missed this bit earlier:

>>> Another thing I spotted:
>>> I just looked at your original mail and, from the work I've done last
>>> week, I can tell that I've never seen* (or therefore used) x-raw-rgb.
>>> * 'seen' as in: the source file having an RGB format
>>> 
>>> For me there was no need ... I have a sink (Qt/C++ code) which takes
>>> x-raw-yuv ... this code hands the frame over to the OpenGL scenegraph
>>> and then I map this memory with code like:
>>> 
>>> void *bits = (void*)mFrame.bits();  // logical (CPU) address of the frame
>>> GLuint physical = ~0U;              // ~0U = physical address unknown;
>>>                                     // the driver resolves it from 'bits'
>>> glTexDirectVIVMap_LOCALE(GL_TEXTURE_2D, w, h, GL_VIV_YV12, &bits,
>>> &physical);
>>> 
>>> This is then used directly in the shaders without any YUV->RGB code. The
>>> texture unit knows YUV and does the texture-coordinate/color lookup
>>> accordingly.
>>> ... what I'm trying to say is: you don't have to convert to GL_RGB(A).

That would be nice if it can be done portably.
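
(If the Vivante path isn't there, the portable fallback I'd expect is doing the BT.601 YUV->RGB step in a GLES2 fragment shader, with the Y/U/V planes uploaded as three separate GL_LUMINANCE textures - a sketch, assuming full-range YUV:

static const char *yuv_frag_src =
    "precision mediump float;                          \n"
    "uniform sampler2D y_tex, u_tex, v_tex;            \n"
    "varying vec2 tc;                                  \n"
    "void main () {                                    \n"
    "  float y = texture2D (y_tex, tc).r;              \n"
    "  float u = texture2D (u_tex, tc).r - 0.5;        \n"
    "  float v = texture2D (v_tex, tc).r - 0.5;        \n"
    "  gl_FragColor = vec4 (y + 1.402 * v,             \n"  /* R */
    "                       y - 0.344 * u - 0.714 * v, \n"  /* G */
    "                       y + 1.772 * u, 1.0);       \n"  /* B */
    "}                                                 \n";

Costs a shader pass instead of being free in the texture unit, but it runs anywhere GLES2 does.)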

Chris Tapp

opensource at keylevel.com
www.keylevel.com





