I want to do a colorspace conversion of a video frame before turning it into an OpenGL texture, using the following code:
struct SwsContext * pSwsCtx = sws_getCachedContext(NULL,width, height, codec->pix_fmt, width, height, AV_PIX_FMT_RGBA, SWS_POINT, NULL, NULL, NULL);
Every call to sws_getCachedContext() gives me the following warning:
[swscaler @ 0x10506fa00] deprecated pixel format used, make sure you did set range correctly
This is the version output of my ffmpeg:
ffmpeg version 2.2 Copyright (c) 2000-2014 the FFmpeg developers
built on Mar 26 2014 15:29:01 with Apple LLVM version 5.1 (clang-503.0.38) (based on LLVM 3.4svn)
configuration: --prefix=/usr/local/Cellar/ffmpeg/2.2 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --enable-avresample --enable-vda --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-libxvid
libavutil 52. 66.100 / 52. 66.100
libavcodec 55. 52.102 / 55. 52.102
libavformat 55. 33.100 / 55. 33.100
libavdevice 55. 10.100 / 55. 10.100
libavfilter 4. 2.100 / 4. 2.100
libavresample 1. 2. 0 / 1. 2. 0
libswscale 2. 5.102 / 2. 5.102
libswresample 0. 18.100 / 0. 18.100
libpostproc 52. 3.100 / 52. 3.100
Hyper fast Audio and Video encoder
Any idea how to disable this warning? And how do I set the color range correctly?
It looks like you are trying to read frames that use one of the deprecated AV_PIX_FMT_YUVJXXXP pixel formats (see the libav doc). You can use this workaround to handle it:
AVPixelFormat pixFormat;
switch (_videoStream->codec->pix_fmt) {
case AV_PIX_FMT_YUVJ420P:
    pixFormat = AV_PIX_FMT_YUV420P;
    break;
case AV_PIX_FMT_YUVJ422P:
    pixFormat = AV_PIX_FMT_YUV422P;
    break;
case AV_PIX_FMT_YUVJ444P:
    pixFormat = AV_PIX_FMT_YUV444P;
    break;
case AV_PIX_FMT_YUVJ440P:
    pixFormat = AV_PIX_FMT_YUV440P;
    break;
default:
    pixFormat = _videoStream->codec->pix_fmt;  // not a deprecated format, keep it as-is
    break;
}
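Applied to the sws_getCachedContext() call from the question, the workaround could look like the sketch below (a minimal illustration only; width, height and _videoStream are assumed to be the variables from the question and the code above):

// Sketch: pass the remapped pixel format instead of the deprecated YUVJ one
// when (re)creating the scaler context.
struct SwsContext *pSwsCtx = sws_getCachedContext(NULL,
        width, height, pixFormat,          // source: remapped pixel format
        width, height, AV_PIX_FMT_RGBA,    // destination: RGBA for the OpenGL texture
        SWS_POINT, NULL, NULL, NULL);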
The question is old, but when I ran into the same problem I looked at it and tried to find an answer to the second part (how to set the color range correctly?). I am extending Thomas Ayoub's answer:
AVCodecContext* pCodecCtx = _videoStream->codec;
AVPixelFormat pixFormat;
switch (pCodecCtx->pix_fmt)
{
case AV_PIX_FMT_YUVJ420P:
    pixFormat = AV_PIX_FMT_YUV420P;
    break;
case AV_PIX_FMT_YUVJ422P:
    pixFormat = AV_PIX_FMT_YUV422P;
    break;
case AV_PIX_FMT_YUVJ444P:
    pixFormat = AV_PIX_FMT_YUV444P;
    break;
case AV_PIX_FMT_YUVJ440P:
    pixFormat = AV_PIX_FMT_YUV440P;
    break;
default:
    pixFormat = pCodecCtx->pix_fmt;
}
// initialize SWS context for software scaling
SwsContext *swsCtx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pixFormat,
                                    pCodecCtx->width, pCodecCtx->height, AV_PIX_FMT_RGB24,
                                    SWS_BILINEAR, NULL, NULL, NULL);
// change the range of the input data: first read the current colorspace details,
// then mark the source range as full range (yuvj).
int *invTable, *table;
int srcRange, dstRange;
int brightness, contrast, saturation;
sws_getColorspaceDetails(swsCtx, &invTable, &srcRange, &table, &dstRange,
                         &brightness, &contrast, &saturation);
const int* coefs = sws_getCoefficients(SWS_CS_DEFAULT);
srcRange = 1; // this marks that the input values are full range (yuvj)
sws_setColorspaceDetails(swsCtx, coefs, srcRange, coefs, dstRange,
                         brightness, contrast, saturation);
What is the range? The limited-range YUV pixel formats use the value ranges Y 16..235 and UV 16..240. YUVJ extends this so that every component uses the full 0..255 range. So setting
srcRange = 1
forces libav to treat the input data as full range. If you do not adjust the range at all, you will probably just get exaggerated contrast; the input data will still be scaled to the RGB color space.
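For completeness, here is a minimal sketch of how the configured context might then be used to convert a decoded frame into an RGB buffer (assuming frame is the decoded AVFrame and pCodecCtx/swsCtx are the objects from the code above; av_image_alloc comes from libavutil/imgutils.h):

// Sketch: convert one decoded frame with the configured context.
uint8_t *rgbData[4];
int rgbLinesize[4];
av_image_alloc(rgbData, rgbLinesize,
               pCodecCtx->width, pCodecCtx->height, AV_PIX_FMT_RGB24, 1);

sws_scale(swsCtx,
          (const uint8_t * const *)frame->data, frame->linesize,
          0, pCodecCtx->height,
          rgbData, rgbLinesize);

// rgbData[0] now holds packed RGB24 pixels with stride rgbLinesize[0],
// ready to be uploaded as a texture, e.g. with glTexImage2D.
av_freep(&rgbData[0]);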