Install OpenCV
Create a file camera_real_time_monitor.py (real-time monitoring) with the following contents:

import cv2                          # import the OpenCV package
cap = cv2.VideoCapture(0)           # open the camera signal
while cap.isOpened():               # loop while the camera signal is available
    ret, frame = cap.read()         # read one frame of the signal into frame
    if not ret:                     # stop if no frame could be read
        break
    cv2.imshow('test', frame)       # preview the frame
    if cv2.waitKey(5) == ord('q'):  # wait 5 ms per frame; press q to quit
        break
cap.release()                       # release the camera
cv2.destroyAllWindows()             # close the preview window
Real-time monitoring with MJPG-streamer
First, install the required dependencies:
sudo apt-get install subversion libjpeg8-dev imagemagick libv4l-dev cmake git
Download MJPG-streamer from GitHub, then build and install it:
git clone https://github.com/jacksonliam/mjpg-streamer.git
cd mjpg-streamer/mjpg-streamer-experimental/
make all
sudo make install
Plug in an ordinary driver-free USB (UVC) camera and run:
./mjpg_streamer -i "./input_uvc.so" -o "./output_http.so -w ./www"
If you are using a Picamera, use the following commands instead:
./mjpg_streamer -i "./input_raspicam.so" -o "./output_http.so -w ./www"
./mjpg_streamer -i "./input_raspicam.so -vs -x 320 -y 240" -o "./output_http.so -w ./www"
-vs enables video stabilisation
-x / -y set the video width and height
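Note that everything inside each pair of quotes is handed to -i or -o as a single argument. A small Python sketch of building the same Picamera command as an argv list, e.g. for launching from a script (the function name and defaults are my own, for illustration):

```python
def streamer_cmd(width=320, height=240, stabilize=True, www="./www"):
    """Build the mjpg_streamer argv list for the Picamera command above.

    Each quoted option string on the command line becomes ONE argv entry,
    which is why -i and -o each take a single space-containing argument.
    """
    input_opts = f"./input_raspicam.so -x {width} -y {height}"
    if stabilize:
        input_opts += " -vs"          # video stabilisation, as above
    return ["./mjpg_streamer",
            "-i", input_opts,
            "-o", f"./output_http.so -w {www}"]

# The resulting list can be passed straight to subprocess.Popen(streamer_cmd()).
```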
Google Chrome is the recommended browser.
Open http://<IP address>:8080 in the browser to view the video.
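The stream can also be consumed programmatically: output_http serves the video as an MJPEG stream, i.e. a sequence of JPEG images separated by multipart boundaries. A minimal sketch of pulling complete JPEG frames out of such a byte stream (the function name is my own; in practice the bytes would come from http://<IP address>:8080/?action=stream):

```python
def extract_jpeg_frames(buf):
    """Extract complete JPEG images from a raw MJPEG byte buffer.

    Each JPEG starts with the SOI marker (FF D8) and ends with EOI (FF D9);
    the multipart boundary text between frames is simply skipped over.
    """
    frames = []
    start = 0
    while True:
        soi = buf.find(b"\xff\xd8", start)   # start of image
        if soi == -1:
            break
        eoi = buf.find(b"\xff\xd9", soi + 2)  # end of image
        if eoi == -1:
            break                             # frame not complete yet
        frames.append(buf[soi:eoi + 2])
        start = eoi + 2
    return frames
```

Each returned frame is a complete JPEG that can be written to disk or decoded with cv2.imdecode.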
ps -ef    # list running processes
kill -s 9 <PID>    # force-kill a process by its PID
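To automate the kill step, the PID can be parsed out of ps -ef output, whose CMD column starts at the eighth field. A rough sketch (the function name and sample output below are my own):

```python
def find_pids(ps_output, name):
    """Return PIDs (second column of `ps -ef`) whose CMD column mentions name."""
    pids = []
    for line in ps_output.splitlines():
        parts = line.split()
        # ps -ef columns: UID PID PPID C STIME TTY TIME CMD...
        if len(parts) >= 8 and name in " ".join(parts[7:]):
            pids.append(int(parts[1]))
    return pids

# A found PID can then be killed with os.kill(pid, signal.SIGKILL),
# the programmatic equivalent of `kill -s 9 <PID>`.
```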
Starting MJPG-streamer at boot:
In mjpg-streamer/mjpg-streamer-experimental (or another folder; adjust the paths below accordingly), create a file named open_mjpg_streamer.sh. I capture images with the Raspberry Pi's Picamera, so the input plugin is input_raspicam.so; if you use a different camera, change that parameter. Mind the spaces. -vs enables video stabilisation, -x is the image width, -y the image height:
./mjpg_streamer -i "./input_raspicam.so -vs -x 320 -y 240" -o "./output_http.so -w ./www"
echo "mjpg_streamer is running"
Open /etc/rc.local:
sudo nano /etc/rc.local
Before the line exit 0, add the following:
cd /home/pi/Desktop/python/mjpg-streamer/mjpg-streamer-experimental/
sh open_mjpg_streamer.sh &
cd -
Invoking face-recognition code from MJPG-streamer (first create filter.py; see https://github.com/jacksonliam/mjpg-streamer/blob/master/mjpg-streamer-experimental/plugins/input_opencv/filters/cvfilter_py/example_filter.py for an example). The -fargs option takes the absolute path to filter.py:

./mjpg_streamer \
  -i "./input_opencv.so -filter cvfilter_py.so -fargs /home/pi/face_detect_orgin/face_detect_orgin/filter.py -r 320x240" \
  -o "./output_http.so -w ./www"
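For reference, the filter module's contract is small: the input_opencv plugin imports the file, calls init_filter() once at startup, and then calls the returned function with each frame (a numpy BGR array). A bare skeleton of that contract, modeled on the example_filter.py linked above; the actual face detection (e.g. cv2.CascadeClassifier plus cv2.rectangle) would go inside filter_fn:

```python
# filter.py -- minimal skeleton of the interface input_opencv expects
def filter_fn(img):
    """Called once per frame; img is a numpy BGR array.

    Draw on the frame here (e.g. detect faces with cv2.CascadeClassifier
    and outline them with cv2.rectangle), then return it to the streamer.
    """
    return img

def init_filter():
    """Called once at startup; load models here, return the per-frame hook."""
    return filter_fn
```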