Using ISI (Image Sensor Interface) in Linux4SAM 6.0 and later
Introduction
This page explains how to enable and configure the ISI on AT91SAM SoCs for different image sensors in the Linux4SAM 6.0 and later releases.
For older releases and older kernels, check this page.
The Image Sensor Interface (ISI) connects a CMOS-type image sensor to the processor and provides image capture in various formats.
The ISI uses either H/VSYNC signals or EAV/SAV codes for synchronization.
The ISI supports the following sensor input formats: YCbCr422, RGB565, RGB888 and grayscale raw data.
ISI supports the following output formats:
- The ISI has two paths for output: a Preview path and a Codec path.
- The Preview path outputs RGB data in different formats.
  - Can convert YCbCr or YUV to RGB.
  - Supports downscaling and decimation.
  - Maximum output resolution is 640x480.
- The Codec path outputs YUV data with different component orders.
  - Can convert RGB to YCrCb.
  - Maximum output resolution is 2048x2048.
Prerequisites
- In SAMA5D3x-EK boards:
  - J11's PIN29 needs to be disconnected when the ISI module board is inserted, since PIN29 is connected to ISI_D11 (pin muxed as TWD1).
  - TWI0 (i2c0)'s TWD0 & TWCK0 signals use the same pins as ISI_VSYNC & ISI_HSYNC, so i2c0 must be disabled when the ISI is enabled.
  - TWI1 (i2c1)'s TWD1 & TWCK1 signals use the same pins as ISI_D11 & ISI_D10, so 12-bit data input is not supported.
  - LED D3 should be disabled as well, since it conflicts with the camera sensor's reset pin PE24.
  - These products have been deprecated in Linux4SAM releases.
- The main board for demonstrating the ISI feature is the SAM9X60-EK with the SAM9X60 MPU.
- Other MPUs and boards either do not support the ISI or have been deprecated in Linux4SAM.
- Supported CMOS sensors: Omnivision OV2640, Omnivision OV7740 and the Micron/Aptina monochrome sensor MT9V022.
Tips: The ISI Device Tree support is added as an overlay in our dt-overlay-at91 tree. For any questions check our DT-overlay page.
Detailed description of software
The ISI driver is part of the media platform drivers and supports the standard V4L2 APIs.
Currently supported sensors: Omnivision OV2640, Omnivision OV7740 and the Micron/Aptina monochrome sensor MT9V022.
Supporting a sensor is easy if it uses a DVP interface connection, i.e. H/VSYNC signals. Sensors with a serial interface are not supported, only parallel ones.
The soc_camera framework has been deprecated. A few sensors still remain in soc_camera, but most have been ported to the platform camera framework, so we do not recommend using soc_camera sensors.
Supported sensors are available in the platform camera framework. You can find all supported platform camera sensors in the kernel menuconfig by the steps below (a matching kernel config fragment is sketched after this list):
- select the menu:
"Device Drivers -> Multimedia support -> Media Controller API"
- select the menu:
"Device Drivers -> Multimedia support -> V4L2 sub-device userspace API"
- deselect the menu:
"Device Drivers -> Multimedia support -> Autoselect ancillary drivers (tuners, sensors, i2c, frontends)"
- Then all supported sensors can be found in the menu:
"Device Drivers -> Multimedia support -> I2C Encoders, decoders, sensors and other helper chips"
Tips: to add support for a new sensor, you need to create a Device Tree Overlay file to add the sensor remote port. The DT-overlay page provides more information. Our public GitHub repository will gladly accept patches.
fswebcam is a neat and simple webcam app. It captures images from a V4L1/V4L2 compatible device or file, averages them to reduce noise and draws a caption using the GD Graphics Library which also handles compressing the image to PNG or JPEG. The resulting image is saved to a file or sent to stdio where it can be piped to something like ncftpput or scp.
- Add fswebcam in Buildroot:
  - Select "Package Selection for the target -> Graphic libraries and applications -> fswebcam".
- Use fswebcam to capture an image:
#!/bin/sh
VIDEO_DEV=/dev/video0
SKIP_FRAMES=20
# test preview channel
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p RGB565 -r 640x480 rgb565.jpg
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p RGB565 -r 320x240 rgb565_defactor.jpg
# test codec channel
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p YUYV -r 640x480 yuyv.jpg
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p YUYV -r 800x600 yuyv_800x600.jpg
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p UYVY -r 640x480 uyvy.jpg
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p UYVY -r 800x600 uyvy_800x600.jpg
# test codec channel without any processing (raw data such as GREY or Bayer RGB)
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p BAYER -r 640x480 bayer_bggr8.jpg
fswebcam -S ${SKIP_FRAMES} -d ${VIDEO_DEV} -p SGRBG8 -r 640x480 bayer_grbg8.jpg
- -S : number of frames to skip.
- -d /dev/video0 : specify the ISI as the input source.
- -p : pixel format; can be RGB565, YUYV, UYVY, BAYER, SGRBG8, etc.
- -r : resolution.
FFmpeg is a complete, cross-platform solution to record, convert and stream audio and video. It supports video4linux2 in Linux.
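As a minimal sketch, the commands below record from the ISI with FFmpeg; the video device node (/dev/video0), resolution and frame counts are assumptions, adjust them to your board.
# record about 100 frames from the ISI video node into an AVI file
ffmpeg -f video4linux2 -video_size 640x480 -i /dev/video0 -frames:v 100 capture.avi
# grab a single frame as a JPEG image
ffmpeg -f video4linux2 -video_size 640x480 -i /dev/video0 -frames:v 1 frame.jpg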
Tips: as no vcodec is specified, mpeg4 is used as the default.
Tips: running ffmpeg -pix_fmts shows all the supported pixel formats.
GStreamer is a library for constructing graphs of media-handling components. The applications it supports range from simple Ogg/Vorbis playback, audio/video streaming to complex audio (mixing) and video (non-linear editing) processing.
GStreamer has been ported to a wide range of operating systems, processors and compilers.
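The pipeline below is a minimal capture sketch; it assumes GStreamer 1.x (hence the gst-launch-1.0 binary name) and the /dev/video0 device node, so adapt both to your image and board.
# capture one buffer from the ISI, convert it and save it as a JPEG
gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=1 ! videoconvert ! jpegenc ! filesink location=capture.jpg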
Tips: running gst-inspect shows all installed plugins.
Tips: running gst-inspect [plugin name] shows all supported parameters for that plugin.
ZXing barcode reader
ZXing is an open-source, multi-format 1D/2D barcode image processing library implemented in Java, with ports to other languages. The project also includes a barcode reader example.
- Add the ZXing barcode reader application in Buildroot.
  - Add the ZXing library in Buildroot (the Linux4SAM Buildroot demo already includes it):
    - Select "Package Selection for the target -> Libraries -> Graphics -> zxing".
  - Apply the attached patch on top of buildroot-2012.11.1-at91.
    - This patch changes the zxing project's Makefile to generate not only the zxing library but also the barcode reader example.
  - Run the command make zxing to generate the barcode application: zxing_barcode.
  - zxing_barcode is located in output/target/usr/bin/.
- Read a barcode from an image using the ZXing barcode reader (a combined capture-and-decode sketch follows below).
  - Get a picture which includes a barcode.
    - Please refer to the FFmpeg section for the image capture.
  - Run the following command to read the barcode:
zxing_barcode *.jpg
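The sketch below chains the capture and decode steps into one small script; the device node, skip count, pixel format and file name are assumptions taken from the fswebcam example above.
#!/bin/sh
# capture one frame from the ISI, then decode any barcode it contains
fswebcam -S 20 -d /dev/video0 -p YUYV -r 640x480 barcode.jpg
zxing_barcode barcode.jpg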
FAQ
- Check the boot messages to see whether there is information about the sensor probe.
i2c i2c-0: OV2640 Probed
...
ov5642 0-003c: reg_read: i2c read error, reg: 300a
ov5642: probe of 0-003c failed with error -121
In the above example, the message shows that an OV2640 sensor was probed, while the probe of the OV5642 failed because the board supports only one camera module slot.
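If the boot log has already scrolled away, the same information can usually be pulled from the kernel ring buffer; the sensor name in the pattern below is an assumption, replace it with the sensor you use.
# search the kernel log for ISI and sensor probe messages
dmesg | grep -i -E 'isi|ov2640'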
- List all the V4L2 devices in the system to check whether the ISI device exists:
# ls /sys/class/video4linux/video*
/sys/class/video4linux/video0:
debug dev index name power subsystem uevent
/sys/class/video4linux/video1:
debug device name subsystem
dev index power uevent
# cat /sys/class/video4linux/video0/name
# cat /sys/class/video4linux/video1/name
isi-camera
In the above example, video1 is the isi-camera device.
- Check the kernel boot messages to see if there are any error messages about the ISI and the sensor.
- Follow the checklist below to troubleshoot such errors (a quick configuration-check sketch follows the list).
- Check the kernel config file:
  - Are the ISI driver and the sensor driver enabled?
- Check your board's device tree files and overlays (.dts, .dtsi, .dtso):
  - Is the ISI device node enabled?
  - Are the ISI pins configured correctly?
  - Is the sensor's i2c info correct?
  - Is the sensor's power/reset pin correct?
  - Is the PCK correct?
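To double-check the first point on a running target, the kernel configuration can be inspected as sketched below, assuming the kernel was built with CONFIG_IKCONFIG_PROC (otherwise grep the .config in the kernel build directory); the symbol names may vary with the kernel version.
# verify that the ISI and sensor drivers are enabled in the running kernel
zcat /proc/config.gz | grep -E 'ATMEL_ISI|OV2640|OV7740|MT9V032'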