[meta-freescale] OV5640 MIPI-CSI2 driver - why the limitations?

Jacob Pedersen jp at circleconsult.dk
Wed Sep 17 05:24:46 PDT 2014


I've been looking into the implementation (drivers/media/platform/mxc/capture/ov5640_mipi.c) and found that the only difference between the 15 fps and the 30 fps I2C blobs is the value of register 0x3035, which holds the system clock divider and the MIPI PCLK scale divider.

For the 30 fps VGA mode the value is 0x14, and for the 15 fps VGA mode the value is 0x22.
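As far as I can tell from the datasheet (so treat this as my reading, not gospel), the upper nibble of 0x3035 is the system clock divider and the lower nibble is the MIPI PCLK scale divider, so the two blobs decode like this. The snippet is just a throwaway sketch I used to check that reading; the register name is my own shorthand, not something from the driver:

/* My reading of register 0x3035 (not code from ov5640_mipi.c):
 * bits [7:4] = system clock divider, bits [3:0] = MIPI PCLK scale divider.
 * The values below are the ones found in the two VGA blobs.
 */
#include <stdio.h>

#define OV5640_REG_0x3035 0x3035   /* shorthand of mine */

static unsigned char pack_0x3035(unsigned sysclk_div, unsigned mipi_div)
{
    return (unsigned char)(((sysclk_div & 0xF) << 4) | (mipi_div & 0xF));
}

int main(void)
{
    printf("30 fps VGA blob: 0x%02x\n", pack_0x3035(1, 4)); /* 0x14 */
    printf("15 fps VGA blob: 0x%02x\n", pack_0x3035(2, 2)); /* 0x22 */
    return 0;
}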

I tried changing the value to 0x12 (divide the MIPI PCLK by two instead of four), which should double the frequency and enable 60 fps. Unfortunately that's not the case - it still runs at 30 fps.
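For what it's worth, I suspect the register blob alone isn't enough: the frame-rate selection in the driver only knows about two rates and clamps everything else. Paraphrasing from memory, the pattern looks roughly like the sketch below (names may not match ov5640_mipi.c exactly), so a 60 fps mode would presumably also need its own enum value and register table:

/* Simplified sketch of the frame-rate selection as I understand it
 * (paraphrased, not a verbatim copy of the driver). */
#include <errno.h>

#define OV5640_MAX_FPS 30

enum ov5640_frame_rate {
    ov5640_15_fps,
    ov5640_30_fps,
    /* no 60 fps entry exists today */
};

static int ov5640_pick_frame_rate(int tgt_fps, enum ov5640_frame_rate *out)
{
    if (tgt_fps > OV5640_MAX_FPS)
        tgt_fps = OV5640_MAX_FPS;   /* requests above 30 fps are clamped */

    if (tgt_fps == 15)
        *out = ov5640_15_fps;
    else if (tgt_fps == 30)
        *out = ov5640_30_fps;
    else
        return -EINVAL;             /* anything else is rejected */

    return 0;
}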

What else do I need to change in the driver?

Best,
Jacob

-----Original Message-----
From: angolini at gmail.com [mailto:angolini at gmail.com] On behalf of Daiane Angolini
Sent: 16 September 2014 16:05
To: Eric Nelson
Cc: Jacob Pedersen; meta-freescale at yoctoproject.org
Subject: Re: [meta-freescale] OV5640 MIPI-CSI2 driver - why the limitations?

On Tue, Sep 16, 2014 at 11:00 AM, Eric Nelson <eric.nelson at boundarydevices.com> wrote:
> On 09/16/2014 06:43 AM, Daiane Angolini wrote:
>> On Tue, Sep 16, 2014 at 10:25 AM, Jacob Pedersen <jp at circleconsult.dk> wrote:
>>> Hi folks,
>>>
>>> I'm using an OmniVision OV5640 MIPI-CSI2 image sensor with an iMX6
>>> Quad board, and I'm working on some computer vision applications. I'm
>>> curious to know why the driver for the OV5640 sensor is limited to
>>> 30 fps, even though the sensor can do 60 fps at 720p and 90 fps at
>>> VGA? As far as I can see in the driver, the frame rate is clamped to a maximum of 30 fps.
>>>
>>> Is there a technical reason, or is it just because it hasn't been 
>>> updated for all the supported modes?
>>>
>>> The driver also limits the available image formats, which forces you
>>> to do the conversion in software or (as in my case) on the IPU. The
>>> IPU is fairly fast at the simple UYVY to RGB24 conversion, but it is
>>> still slower than getting an RGB image straight from the sensor.
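
(For reference, the conversion I run on the IPU is just the standard BT.601 UYVY-to-RGB24 math. The sketch below is my own illustration of the per-pixel arithmetic, not the IPU code itself.)

/* UYVY -> RGB24 for one macropixel (two pixels share one U/V pair).
 * Integer BT.601 approximation, purely illustrative of the colour-space
 * conversion the IPU does in hardware. */
#include <stdint.h>

static uint8_t clamp_u8(int v)
{
    return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v);
}

/* in: 4 bytes U0 Y0 V0 Y1   out: 6 bytes R0 G0 B0 R1 G1 B1 */
static void uyvy_to_rgb24(const uint8_t in[4], uint8_t out[6])
{
    int u = in[0] - 128, v = in[2] - 128;

    for (int i = 0; i < 2; i++) {
        int c = in[1 + 2 * i] - 16;   /* Y0, then Y1 */
        out[3 * i + 0] = clamp_u8((298 * c + 409 * v + 128) >> 8);
        out[3 * i + 1] = clamp_u8((298 * c - 100 * u - 208 * v + 128) >> 8);
        out[3 * i + 2] = clamp_u8((298 * c + 516 * u + 128) >> 8);
    }
}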
>>
>> This limitation comes from the camera device driver. Only a few
>> configurations are implemented.
>>
>
> Those I2C blobs are pretty difficult to get right!


+1

Daiane

