[meta-freescale] [meta-fsl-arm] Using gstreamer on Nitrogen6x/SabreLite
Carlos Rafael Giani
dv at pseudoterminal.org
Wed Mar 5 10:06:23 PST 2014
On 2014-03-05 18:59, Eric Nelson wrote:
> Hi Gary,
> On 03/05/2014 10:52 AM, Gary Thomas wrote:
>> On 2014-03-05 10:44, Eric Nelson wrote:
>>> Hi Gary,
>>> On 03/04/2014 08:03 AM, Gary Thomas wrote:
>>>> I have a SabreLite with the OV5642 camera. I'd like to capture
>>>> some video and display it on the screen. Here's my gstreamer pipeline:
>>>> gst-launch -e -vvv mfw_v4lsrc device=/dev/video0 num-buffers=100
>>>> typefind=true \
>>>> ! "video/x-raw-yuv, format=(fourcc)I420, width=640,
>>>> height=480, framerate=(fraction)30/1" \
>>>> ! ffmpegcolorspace \
>>>> ! ximagesink
>>>> What I don't understand is why the format from mfw_v4lsrc has to
>>>> be I420 when the OV5642 [kernel] driver seems to only support YUYV
>>>> To further confuse, I can grab a frame like this:
>>>> yavta -fYUYV -s640x480 -F -c1 /dev/video0
>>>> and the V4L2 subsystem tells me this sensor is YUYV:
>>>> root at nitrogen6x:~# v4l2-ctl -d /dev/video0 --list-formats
>>>> ioctl: VIDIOC_ENUM_FMT
>>>> Index : 0
>>>> Type : Video Capture
>>>> Pixel Format: 'YUYV'
>>>> Name :
>>> This appears to be a bug in the video driver (mxc_v4l2_capture) with
>>> respect to enumeration (not a strong point for the drivers).
>>> The driver (mxc_v4l2_capture) appears to support a wide variety of
>>> pixel formats through the magic of the IPU.
>>> The mfw_v4lsrc plugin appears to be hard-coded to support only
>>> UYVY and I420 on i.MX6 though (and NV12 on i.MX51).
>>> Note that your pipeline is very expensive, and would benefit from
>>> using a sink that can support YUV natively (mfw_isink, mfw_v4lsink)
>>> or that can do the conversion in hardware (glimagesink).
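To illustrate the suggestion, a capture-and-display pipeline along those lines might look like this (a sketch only; the mfw_v4lsrc property names used here are assumptions, so check them with gst-inspect on your BSP):

```shell
# Display camera frames through a sink that accepts YUV directly,
# avoiding the software ffmpegcolorspace conversion.
# Property names (capture-width, capture-height, fps-n) are assumptions.
gst-launch -e mfw_v4lsrc device=/dev/video0 \
    capture-width=640 capture-height=480 fps-n=30 \
    ! mfw_v4lsink
```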
>> Thanks. I only used ximagesink as an example (one that also
>> works on the desktop) for others to see. In the end, the video
>> will probably be packaged into some container format (.mp4) and/or
>> streamed, so those other methods don't help much.
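For the container case, a hedged sketch using the i.MX6 VPU encoder element from the Freescale 0.10 plugin set (the element name vpuenc is from that plugin set, but the codec=avc property is an assumption; verify with gst-inspect vpuenc):

```shell
# Sketch: capture, hardware-encode to H.264 on the VPU, mux into MP4.
# -e makes gst-launch send EOS on Ctrl-C so the muxer can finalize
# the file; property names are assumptions.
gst-launch -e mfw_v4lsrc device=/dev/video0 num-buffers=300 \
    ! vpuenc codec=avc \
    ! qtmux ! filesink location=capture.mp4
```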
>> Is it possible to use the v4l2src gstreamer element with this video?
>> I haven't been able to make that work at all...
> Not without some hacking of the kernel driver(s), and after that,
> you'd need to figure out how to handle allocation of DMA'able
> buffers, and so on.
> If you were to undertake that, I'd recommend taking a hard look
> at Carlos's GStreamer-1.0 in the process, since I understand that
> the buffer-chaining process is different in 1.0 (and cleaner in 1.0).
Philip Craig kindly submitted patches to the gstreamer-imx project to
add a v4l2src element that uses i.MX6 specifics. And yes, the allocation
scheme is much cleaner in GStreamer 1.0. In 0.10, it was only possible
through hacks. I implemented an imxeglvivsink in gstreamer-imx, which is
a video sink that outputs YUV frames located in DMA buffers with OpenGL
ES and the Vivante GPU drivers' direct texture extension. This extension
allows for using DMA buffers as pixel source for OpenGL ES textures. Net
result is that the frames are shown directly without involving a CPU
copy. I usually measure ~7% CPU usage for typical 1080p videos, more if
a nontrivial audio codec has been used (like AC3).
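The direct-texture path described above comes down to the GL_VIV_direct_texture extension. Roughly, it looks like the following (a non-compilable sketch against the Vivante driver headers; the texture handle and buffer addresses are placeholders supplied by the caller):

```c
/* Sketch of GL_VIV_direct_texture usage: glTexDirectVIVMap() binds an
 * existing DMA buffer (logical + physical address) to the currently
 * bound texture, so the GPU samples the frame without a CPU copy. */
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>   /* declares glTexDirectVIVMap on Vivante */

void upload_frame(GLuint tex, GLvoid *logical, GLuint physical,
                  int width, int height)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexDirectVIVMap(GL_TEXTURE_2D, width, height, GL_VIV_I420,
                      &logical, &physical);
    glTexDirectInvalidateVIV(GL_TEXTURE_2D); /* contents have changed */
}
```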
ximagesink should never be used for video (unless it is the only sink
that is available). When possible, either use a platform specific sink,
like the mfw ones for 0.10 or imxeglvivsink in my 1.0 plugins, or at
least xvimagesink. (But I realize that XVideo is not exposed in the X11
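On the 1.0 side, a minimal playback pipeline with the gstreamer-imx sink could be sketched as follows (the file path is a placeholder; playbin selects decoders automatically):

```shell
# Sketch: play a file with the GPU-backed imxeglvivsink from
# gstreamer-imx, keeping decoded YUV frames out of the CPU path.
gst-launch-1.0 playbin uri=file:///path/to/video.mp4 \
    video-sink=imxeglvivsink
```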