Wide Angle Edimax Camera

A 2006 summer project


The software developed to do image composition on the BR6104K is fairly simple. It consists of two major pieces. At one end, a thread continuously retrieves images from the cameras attached to the router and composes them into one larger image (see cam_comp.c, imagecomp.c). At the other end, imaged.c accepts connections on port 4321 of the WAN interface; any time the request sequence "0110" is received, it sends back four bytes (two file-size bytes and two junk bytes, just like a single camera) followed by the panoramic image itself. These all compile into one executable, imaged (an image daemon), which should work on any mipsel device.
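The request/response framing is simple enough to sketch in a few lines of C. This is only an illustration, not the real imaged.c: the file path, the byte order of the size field, and all the names are assumptions; only the framing itself (a "0110" request answered with a four-byte header and then the JPEG) is taken from the description above.

    /* Minimal sketch of the imaged framing; not the real imaged.c. */
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    static void serve_client(int sock)
    {
        char req[4];

        /* Wait for the "0110" request sequence from the client. */
        if (read(sock, req, 4) != 4 || memcmp(req, "0110", 4) != 0)
            return;

        FILE *f = fopen("/tmp/panorama.jpg", "rb");   /* path is an assumption */
        if (!f)
            return;

        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        rewind(f);

        /* Two file-size bytes followed by two junk bytes, mimicking the
         * header a single camera sends. Byte order is a guess. */
        unsigned char header[4];
        header[0] = size & 0xff;
        header[1] = (size >> 8) & 0xff;
        header[2] = header[3] = 0;
        write(sock, header, 4);

        /* Stream the panoramic JPEG itself. */
        unsigned char buf[1024];
        size_t n;
        while ((n = fread(buf, 1, sizeof(buf), f)) > 0)
            write(sock, buf, n);

        fclose(f);
    }

    int main(void)
    {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr;

        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(4321);
        bind(srv, (struct sockaddr *)&addr, sizeof(addr));
        listen(srv, 4);

        for (;;) {
            int client = accept(srv, NULL, NULL);
            if (client < 0)
                continue;
            serve_client(client);
            close(client);
        }
    }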

The thread for retrieving images from the cameras is not particularly complicated. All the cameras take their pictures at roughly the same time, one after another, and the composition is done immediately after the four pictures are taken. Doing everything in the same thread avoids synchronization issues, and actually frees up enough system resources to speed up the image composition a little.
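In outline, the capture-and-compose loop looks something like the sketch below. The helper names are placeholders rather than the actual functions in cam_comp.c; the stubs only mark where the real camera I/O and libjpeg work would go.

    /* Sketch of the single capture-and-compose thread (names are illustrative). */
    #include <pthread.h>
    #include <stdio.h>
    #include <unistd.h>

    #define NUM_CAMERAS 4

    /* Placeholder: fetch one JPEG from camera i over the LAN. */
    static void capture_camera(int i)
    {
        printf("capturing camera %d\n", i);
    }

    /* Placeholder: stitch the four stored frames into the panorama
     * that imaged serves on port 4321. */
    static void compose_panorama(void)
    {
        printf("composing panorama\n");
    }

    static void *capture_loop(void *arg)
    {
        (void)arg;
        for (;;) {
            /* Take the four pictures back to back, roughly simultaneously. */
            for (int i = 0; i < NUM_CAMERAS; i++)
                capture_camera(i);

            /* Compose immediately afterwards, in the same thread, so no
             * locking is needed between capture and composition. */
            compose_panorama();
            sleep(1);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t tid;
        pthread_create(&tid, NULL, capture_loop, NULL);
        pthread_join(tid, NULL);
        return 0;
    }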

The image composition is currently very simplistic. It uses the IJG JPEG library (libjpeg) to open the four pictures retrieved by the capture thread and read them into a large buffer of RGB data. Each image comes in from the camera sideways, so as it is placed into the RGB buffer its data is copied from rows into columns, and thereby rotated. The RGB buffer is then compressed and written out to a JPEG file, which the server thread sends to any client that requests it. The final image is simply the individual images from the cameras placed side by side in the appropriate order.
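The rotate-and-place step is easy to illustrate. The sketch below is a simplification rather than the code in imagecomp.c: it assumes 320x240 source frames (so four rotated frames make a 960x320 composite), works on RGB buffers that libjpeg has already decompressed, and picks a 90-degree clockwise rotation arbitrarily.

    /* Sketch of the rotate-and-place step; dimensions and rotation
     * direction are assumptions, not taken from imagecomp.c. */
    #include <string.h>

    #define CAM_W   320           /* camera frame width (assumed)  */
    #define CAM_H   240           /* camera frame height (assumed) */
    #define PANO_W  (4 * CAM_H)   /* 960 */
    #define PANO_H  CAM_W         /* 320 */

    /* Copy one sideways camera frame into the composite, rotating rows into
     * columns, and offsetting it by its slot (0..3) from left to right. */
    static void place_rotated(const unsigned char *src, unsigned char *pano, int slot)
    {
        int x, y;
        for (y = 0; y < CAM_H; y++) {
            for (x = 0; x < CAM_W; x++) {
                /* Source pixel (x, y) lands at row x, column
                 * slot*CAM_H + (CAM_H-1-y) of the composite:
                 * a 90-degree clockwise rotation. */
                const unsigned char *s = src + 3 * (y * CAM_W + x);
                unsigned char *d = pano +
                    3 * (x * PANO_W + slot * CAM_H + (CAM_H - 1 - y));
                memcpy(d, s, 3);
            }
        }
    }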

The original intention of the project was to do much more complicated image processing within the router. However, due to hardware limitations that became evident once the simple image composition software was written, the actual image stitching algorithm was never implemented on the router. It was, however, successfully implemented in Matlab on a desktop computer. After calibrating the four cameras, extrinsic parameters were determined for each camera relative to a fixed camera some distance behind the device. With the cameras' relative positions known, a correspondence was established between pixels in an ideal 960x320 cylindrically projected panorama and real pixels in one of the four available images. The seams in the generated panoramas are noticeable but not exaggerated, and since the stitching can't run on the router anyway, this is largely a proof of concept. An untouched image from the router looks like this, whereas a stitched panorama should look something like this. The error in the middle seam is probably due to slight error in the calculated extrinsic parameters.
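The pixel correspondence itself is straightforward to sketch. The code below only illustrates the geometry and is not the Matlab implementation: the pinhole camera model, the structure and parameter names, and the decision to ignore camera translation are all simplifying assumptions.

    /* Geometry sketch of the cylindrical-panorama lookup (not the Matlab code). */
    #include <math.h>

    #define PANO_W 960
    #define PANO_H 320

    typedef struct {
        double R[3][3];          /* extrinsic rotation, panorama frame -> camera frame */
        double fx, fy, cx, cy;   /* pinhole intrinsics from calibration */
    } camera_t;

    /* Map panorama pixel (px, py) to a pixel (u, v) in camera cam.
     * fov is the total horizontal field of view covered by the panorama,
     * in radians. Returns 0 if the point projects behind the camera. */
    static int pano_to_camera(const camera_t *cam, int px, int py,
                              double fov, double *u, double *v)
    {
        /* Ray on the unit viewing cylinder for this panorama column/row. */
        double theta = (px - PANO_W / 2.0) * (fov / PANO_W);
        double f_pano = PANO_W / fov;          /* cylinder focal length */
        double d[3] = { sin(theta),
                        (py - PANO_H / 2.0) / f_pano,
                        cos(theta) };

        /* Rotate the ray into the camera's coordinate frame
         * (translation is ignored in this sketch). */
        double c[3];
        for (int i = 0; i < 3; i++)
            c[i] = cam->R[i][0]*d[0] + cam->R[i][1]*d[1] + cam->R[i][2]*d[2];

        if (c[2] <= 0.0)
            return 0;

        /* Pinhole projection with the calibrated intrinsics. */
        *u = cam->fx * c[0] / c[2] + cam->cx;
        *v = cam->fy * c[1] / c[2] + cam->cy;
        return 1;
    }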