Integrating multiple camera streams

Hi -
I'm very new to Parallella and I have a general idea for a project, but I wanted to ask whether there is a simpler mechanism than what I've been planning.
I need to integrate multiple video streams - either 4 or 6 of them - into a single larger stream. The cameras' fields of view will overlap slightly, and the integrated output will be one merged view.
On the merged view I may do some image processing (limited feature recognition, some geometry correction) and/or compression (probably wavelet).
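For the compression part I'm only at the "probably wavelet" stage. Just to get a feel for how sparse the coefficients get, I've been playing with a rough, untested sketch like the one below on a desktop using PyWavelets (the wavelet choice, level, and keep ratio are placeholders; whatever actually runs on the Parallella would be written very differently):

# Toy experiment: 2D wavelet transform of one grayscale frame, then keep
# only the largest coefficients to see how much can be thrown away.
import numpy as np
import pywt

def compress_frame(frame, wavelet="haar", level=3, keep_ratio=0.05):
    """Multilevel 2D DWT, keep the largest `keep_ratio` of coefficients, reconstruct."""
    coeffs = pywt.wavedec2(frame, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    threshold = np.percentile(np.abs(arr), 100 * (1 - keep_ratio))
    arr[np.abs(arr) < threshold] = 0.0  # zero out the small coefficients
    coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, wavelet)

if __name__ == "__main__":
    frame = np.random.rand(480, 640)  # stand-in for a real camera frame
    recon = compress_frame(frame)
    print("max reconstruction error:", np.abs(recon[:480, :640] - frame).max())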
The simplest way I can see to do this seems a bit kludgy to me: I could have multiple Raspberry Pi computers (each with its own camera) push the images over Ethernet to the Parallella. The Parallella could then stitch the images together and do the rest of the image manipulation in a pipeline. The frame rate and image size could be varied depending on the available throughput. Perhaps some code could run on the Pis' GPUs and some on the Parallella - I'm still thinking this through.
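To make the plan a bit more concrete, here's a rough, untested sketch of what I imagine each Pi doing: grab JPEG frames with picamera and push them to the Parallella over a plain TCP socket, each frame prefixed with its length (the address, port, resolution, and framing are just placeholders, not a worked-out protocol):

# Pi-side sender sketch: length-prefixed JPEG frames over TCP.
import io
import socket
import struct

import picamera  # standard picamera package on the Pi

PARALLELLA_ADDR = ("192.168.1.50", 5000)  # placeholder address/port

def stream_frames():
    sock = socket.create_connection(PARALLELLA_ADDR)
    try:
        with picamera.PiCamera(resolution=(640, 480), framerate=15) as camera:
            stream = io.BytesIO()
            # capture_continuous keeps reusing the same in-memory stream
            for _ in camera.capture_continuous(stream, format="jpeg",
                                               use_video_port=True):
                frame = stream.getvalue()
                # 4-byte big-endian length header, then the JPEG payload
                sock.sendall(struct.pack(">I", len(frame)) + frame)
                stream.seek(0)
                stream.truncate()
    finally:
        sock.close()

if __name__ == "__main__":
    stream_frames()

The Parallella side would just read the length header, pull in that many bytes per camera, and feed the decoded frames into the stitching pipeline.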
I think something like this might work, but from a hardware point of view it's a lot of cables, and all of the devices together would draw a fair amount of power. Is there a more elegant way to get multiple I/O streams into the board than separate CPUs and Ethernet? I don't know whether there are daughterboard solutions for this, but I'd be interested in learning more.
Thanks
Sean