Hello all, I am fairly familiar with Linux. I have been trying to use a program called MotionEye (a web frontend similar to the "motion" project) but have run into a few problems. I am not sure how many of them are due to MotionEye, perhaps none, but I am also interested in giving ZoneMinder a go for comparison.
Here is my current hardware:
3x Pogoplug v4 (~800 MHz CPU, 128 MB RAM)
2x Raspberry Pi 1 Model B (the first popular Raspberry Pi, ~700 MHz CPU)
2x wireless G USB 2.0 dongles
1x wireless N USB 2.0 dongle
1x Asus eeePC 901 netbook (1.6 GHz Atom CPU, 2 GB RAM, 160 GB HDD)
3x cheap USB 2.0 640x480 webcams capable of YUYV ONLY (no MJPEG)
My goal was to have:
1 [(webcam+pogoplug) = IP camera] \
2 [(webcam+pogoplug) = IP camera]  -> wireless -> router -> AsusEEE_PC
3 [(webcam+pogoplug) = IP camera] /
My thought was that I would have three IP cameras connected to the EEE_PC, which would be running ZoneMinder/MotionEye. However, I have been playing with mjpg-streamer to serve each camera's stream over HTTP, and because my webcams only output YUYV, the JPEG encoding has to happen in software and maxes out the Pogoplug's CPU. This in turn ruins the framerate and makes the video laggy, with constant freezing.
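For reference, this is roughly the mjpg-streamer invocation I mean (flags from memory; the device node, frame rate, port, and www path are assumptions and may differ between builds):

```shell
# Capture YUYV from the webcam and serve an MJPEG stream over HTTP.
# The -y switch tells input_uvc.so to capture raw YUYV, so every frame
# must be JPEG-encoded in software -- this is what pegs the Pogoplug CPU.
mjpg_streamer \
  -i "input_uvc.so -d /dev/video0 -r 640x480 -f 10 -y" \
  -o "output_http.so -p 8080 -w /usr/local/share/mjpg-streamer/www"
```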
I think it should be possible to reduce the processing load on the Pogoplug by serving the uncompressed video feed instead. But, from what I have read, uncompressed video feeds take up a ton of network bandwidth.
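To put rough numbers on that (assuming 10 fps per camera, which is my guess at a usable rate): YUYV 4:2:2 packs 2 bytes per pixel, so at 640x480 a single raw stream already needs about 49 Mbit/s, more than a wireless G link realistically carries (~20 Mbit/s of real throughput on a nominal 54 Mbit/s link):

```python
# Back-of-the-envelope bandwidth for raw YUYV video.
# Assumed numbers: 640x480 resolution, 10 fps, 3 cameras.
WIDTH, HEIGHT = 640, 480
BYTES_PER_PIXEL = 2          # YUYV 4:2:2 = 2 bytes per pixel
FPS = 10
CAMERAS = 3

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL       # 614,400 bytes
mbits_per_sec = bytes_per_frame * FPS * 8 / 1_000_000    # one camera
print(f"one camera:  {mbits_per_sec:.1f} Mbit/s")        # ~49.2 Mbit/s
print(f"all cameras: {mbits_per_sec * CAMERAS:.1f} Mbit/s")
```

So even one uncompressed camera would swamp my wireless G dongles, and all three together would strain a 100 Mbit wired segment too.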
1. Would it make sense for me to put the cameras on a separate network, to keep the traffic off my main LAN?
2. Does anyone have any suggestions or recommendations given my setup? I have also considered running Ethernet cable to each Pogoplug to provide PoE and data instead of using wifi.