MJPEG Streaming over UVC
Background
- Platform: IMX6Q
- OS: Linux-QT5
- Kernel: 4.1.15
Problem
The customer felt that the camera (OV5640) on an earlier product (IMX6Q, Qt) did not perform well enough. The old hardware's camera interface only supports DVP, matching DVP cameras are now hard to source, and swapping in a different sensor would mean a long driver bring-up cycle. So we decided to switch to a USB camera instead. After scouring the Huaqiang market in Shenzhen, we found a few candidates to evaluate.
Approximate parameters:
1. YUYV(YUV 4:2:2 (YUYV))
discrete: 640x480: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 160x120: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 320x240: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 352x288: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 800x600: 1/20 1/15 1/10 1/5
discrete: 960x540: 1/15 1/10 1/5
discrete: 1024x768: 1/10 1/5
discrete: 1280x720: 1/9 1/5
discrete: 1280x960: 1/7 1/5
discrete: 1920x1080: 1/5
2. MJPG(MJPEG)
discrete: 640x480: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 160x120: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 320x240: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 352x288: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 800x600: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 960x540: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 1024x768: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 1280x720: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 1280x960: 1/30 1/25 1/20 1/15 1/10 1/5
discrete: 1920x1080: 1/30 1/25 1/20 1/15 1/10 1/5
Clearly, the raw YUYV stream cannot deliver 720p at 30 fps (the camera offers at most 9 fps at 1280x720 in YUYV), so the only option is MJPEG mode.
Process
The old pipeline:
local DVP camera => YUV420 => display, or H.264 encode for transmission/storage
The new pipeline:
UVC camera => MJPEG stream => MJPEG decode => NV16 => color space conversion => NV12 (I420) => display, or H.264 encode for transmission/storage
MJPEG stream capture
See the earlier article: https://notes.z-dd.online/2019/05/21/V4L2采集视频/
The main change is setting the capture format to V4L2_PIX_FMT_MJPEG.
MJPEG decoding
Reuse the earlier imxvpuapi. The key code:
if ((ret = imx_vpu_jpeg_dec_decode(ctx->jpeg_dec, in_buff, in_size)) == IMX_VPU_DEC_RETURN_CODE_OK)
{
    /* query decoding info */
    imx_vpu_jpeg_dec_get_info(ctx->jpeg_dec, &info);
    /* map the framebuffer to get its virtual address */
    mapped_virtual_address = imx_vpu_dma_buffer_map(info.framebuffer->dma_buffer, IMX_VPU_MAPPING_FLAG_READ);
    /* color format conversion */
    pixel_format_nv16_to_nv12(mapped_virtual_address, out_buff, info.aligned_frame_width, info.aligned_frame_height);
    *w = info.aligned_frame_width;
    *h = info.aligned_frame_height;
    *out_size = num_out_byte;
    /* unmap */
    imx_vpu_dma_buffer_unmap(info.framebuffer->dma_buffer);
    imx_vpu_jpeg_dec_frame_finished(ctx->jpeg_dec, info.framebuffer);
}
Color space conversion
Three approaches were tried:
G2D color format conversion: as the G2D documentation notes, "g2d can only perform color space converting to RGB space", so it cannot produce the NV12 we need.
Reference code:
int g2d_nv16_to_rgb(struct g2d_buf *buf_yuv, int in_width, int in_height,
                    struct g2d_buf *buf_rgb, int out_width, int out_height)
{
    struct g2d_surface src, dst;
    void *g2dHandle = NULL;

    if (g2d_open(&g2dHandle) == -1 || g2dHandle == NULL) {
        printf("Fail to open g2d device!\n");
        //g2d_free(buf_yuv);
        return RETVAL_ERROR;
    }

    src.planes[0] = buf_yuv->buf_paddr;
    src.planes[1] = buf_yuv->buf_paddr + in_width * in_height;
    src.left = 0;
    src.top = 0;
    src.right = in_width;
    src.bottom = in_height;
    src.stride = in_width;
    src.width = in_width;
    src.height = in_height;
    src.rot = G2D_ROTATION_0;
    src.format = G2D_NV16;

    dst.planes[0] = buf_rgb->buf_paddr;
    dst.left = 0;
    dst.top = 0;
    dst.right = out_width;
    dst.bottom = out_height;
    dst.stride = out_width;
    dst.width = out_width;
    dst.height = out_height;
    dst.rot = G2D_ROTATION_0;
    dst.format = G2D_YUYV;

    g2d_blit(g2dHandle, &src, &dst);
    g2d_finish(g2dHandle);
    g2d_close(g2dHandle);

    return RETVAL_OK;
}
IPU color format conversion: the IPU does not accept the NV16 format our decoder outputs as a source format.
Reference code:
// Open IPU device
ctx->fd_ipu = open("/dev/mxc_ipu", O_RDWR, 0);
if (ctx->fd_ipu < 0) {
    printf("open ipu dev fail\n");
    return NULL;
}

// Input image size and format
ctx->task.input.width = 1280;
ctx->task.input.height = 720;
ctx->task.input.format = IPU_PIX_FMT_YUYV;

// Output image size and format
ctx->task.output.width = 1280;
ctx->task.output.height = 720;
ctx->task.output.format = IPU_PIX_FMT_YUV420P;
//ctx->task.output.rotate = 1;

// Calculate the output size
ctx->osize = ctx->task.output.paddr = ctx->task.output.width * ctx->task.output.height * fmt_to_bpp(ctx->task.output.format) / 8;

// Calculate input size from image dimensions and bits-per-pixel according to format
ctx->isize = ctx->task.input.paddr = ctx->task.input.width * ctx->task.input.height * fmt_to_bpp(ctx->task.input.format) / 8;

// Allocate contiguous physical memory for the images.
// paddr passes in the amount needed and is replaced with the physical address on success.
ret = ioctl(ctx->fd_ipu, IPU_ALLOC, &(ctx->task.output.paddr));
if (ret < 0) {
    printf("ioctl IPU_ALLOC fail!\n");
    ret = -1;
    return ret;
}
ret = ioctl(ctx->fd_ipu, IPU_ALLOC, &(ctx->task.input.paddr));
if (ret < 0) {
    printf("ioctl IPU_ALLOC fail!\n");
    ret = -1;
    return ret;
}

// Map the input buffer to obtain its virtual address
ctx->inbuf = mmap(0, ctx->isize, PROT_READ | PROT_WRITE, MAP_SHARED, ctx->fd_ipu, ctx->task.input.paddr);
if (!ctx->inbuf) {
    printf("mmap fail\n");
    ret = -1;
    return ret;
}

// Map the output buffer
ctx->outbuf = mmap(0, ctx->osize, PROT_READ | PROT_WRITE, MAP_SHARED, ctx->fd_ipu, ctx->task.output.paddr);
if (!ctx->outbuf) {
    printf("mmap fail\n");
    ret = -1;
    return ret;
}
///...
memcpy(ctx->inbuf, mapped_virtual_address, ctx->isize);
//ctx->task.input.paddr = mapped_virtual_address;

// Perform the color space conversion
ret = ioctl(ctx->fd_ipu, IPU_QUEUE_TASK, &(ctx->task));
if (ret < 0) {
    printf("ioctl IPU_QUEUE_TASK fail %x\n", ret);
    return ret;
}
//memcpy(out_buff, mapped_virtual_address, num_out_byte);
memcpy(out_buff, ctx->outbuf, ctx->osize);
///...
// Cleanup: unmap and free before closing the fd (IPU_FREE still needs it)
if (ctx->outbuf)
    munmap(ctx->outbuf, ctx->osize);
if (ctx->inbuf)
    munmap(ctx->inbuf, ctx->isize);
if (ctx->task.output.paddr)
    ioctl(ctx->fd_ipu, IPU_FREE, &ctx->task.output.paddr);
if (ctx->task.input.paddr)
    ioctl(ctx->fd_ipu, IPU_FREE, &ctx->task.input.paddr);
if (ctx->fd_ipu)
    close(ctx->fd_ipu);
Pure software color format conversion: flexible, but CPU-hungry and slow.
Reference code:
int pixel_format_nv16_to_nv12(char *nv16_buff, char *nv12_buff, int w, int h)
{
    unsigned char *nv16_y = NULL;
    unsigned char *nv16_uv = NULL;
    unsigned char *nv12_y = NULL;
    unsigned char *nv12_u = NULL;
    unsigned char *nv12_v = NULL;
    int i, j, offset;

    if (!nv16_buff || !nv12_buff || ((w * h) <= 0)) {
        printf("ERROR: %s input args invalid!\n", __func__);
        return RETVAL_ERROR;
    }

    /* set up the plane pointers */
    nv16_y = (unsigned char *)nv16_buff;
    nv16_uv = (unsigned char *)nv16_buff + w * h;
    nv12_y = (unsigned char *)nv12_buff;
    nv12_u = (unsigned char *)nv12_buff + w * h;
    nv12_v = nv12_u + 1;

    /* copy the Y data directly */
    memcpy(nv12_y, nv16_y, w * h);

    /* derive the NV12 UV data from the NV16 UV plane
     *
     * >>>> NV16 pixel format:
     * start + 0:  Y'00 Y'01 Y'02 Y'03
     * start + 4:  Y'10 Y'11 Y'12 Y'13
     * start + 8:  Y'20 Y'21 Y'22 Y'23
     * start + 12: Y'30 Y'31 Y'32 Y'33
     * start + 16: Cb00 Cr00 Cb01 Cr01
     * start + 20: Cb10 Cr10 Cb11 Cr11
     * start + 24: Cb20 Cr20 Cb21 Cr21
     * start + 28: Cb30 Cr30 Cb31 Cr31
     *
     * >>>> NV12 pixel format:
     * start + 0:  Y'00 Y'01 Y'02 Y'03
     * start + 4:  Y'10 Y'11 Y'12 Y'13
     * start + 8:  Y'20 Y'21 Y'22 Y'23
     * start + 12: Y'30 Y'31 Y'32 Y'33
     * start + 16: Ub00 Vr00
     * start + 18: Ub01 Vr01
     * start + 20: Ub10 Vr10
     * start + 22: Ub11 Vr11
     *
     * Mapping from NV16 UV samples to NV12 UV samples:
     * Cb is taken from even rows, Cr from odd rows (0-indexed):
     * Cb00 Cb01 ----> Ub00 Ub01
     * Cr10 Cr11 ----> Vr00 Vr01
     * Cb20 Cb21 ----> Ub10 Ub11
     * Cr30 Cr31 ----> Vr10 Vr11
     */

    /* take Cb from even rows */
    offset = 0;
    for (i = 0; i < h; i += 2) {
        offset = i * w;
        for (j = 0; j < w; j += 2) {
            *nv12_u = *(nv16_uv + offset + j);
            nv12_u += 2;
        }
    }

    /* take Cr from odd rows */
    offset = 0;
    for (i = 1; i < h; i += 2) {
        offset = i * w;
        for (j = 1; j < w; j += 2) {
            *nv12_v = *(nv16_uv + offset + j);
            nv12_v += 2;
        }
    }

    return 0;
}
Conclusion
- The pipeline gains two extra steps (MJPEG decode and color space conversion), which adds latency and CPU load and drags down the frame rate
- Performance and quality for one-to-one intercom (2 decodes + 1 encode) still need to be evaluated
- USB camera stability and the hot-pluggable connection both carry significant risk
Miscellaneous
Front and rear camera configuration on IMX6Q Android 5.1, in the init.i.MX6Q.rc file under the device directory:
# set back camera.
setprop back_camera_name ov5642_camera
# set front camera.
setprop front_camera_name ov5642_camera,uvc,ov5640_camera,ov5640_mipi
The system probes for the cameras in this order and stops at the first one it detects, so the order matters.