  • Displaying YUV Data with OpenGL ES 2.0 on Android

    This post uses OpenGL ES to display YUV images on Android. The reasons for doing it this way are:

    1. Android cannot display YUV images directly, so converting YUV to RGB is unavoidable;

    2. Converting YUV to RGB by hand eats a lot of CPU; play video that way and the phone gets hot, so we want the GPU to do this work as much as possible;

    3. OpenGL ES is a third-party library that Android has integrated into its own framework, and it has a lot going for it.

    My C/C++ is not very good, so the whole thing is implemented in the Java layer. I mainly referred to (but not only) the following articles; many thanks to these authors for sharing:

    1. http://blog.csdn.NET/xiaoguaihai/article/details/8672631

    2. http://chenshun87.blog.163.com/blog/static/18859389201232011727615/

    3. http://blog.csdn.net/ypist/article/details/8950903

    4. http://blog.csdn.net/wanglang3081/article/details/8480281

    5. http://blog.csdn.net/xdljf/article/details/7178620

    I. First, an overview of how this solution works, to give you the general picture

    1. Where is it displayed -> a GLSurfaceView

    2. Who puts the data onto the GLSurfaceView -> the Renderer

    3. Who converts the YUV data to RGB -> the GL Program/Shader

    In one sentence: the GL Program/Shader converts the YUV data you pass in into RGB, and the Renderer draws the result onto the GLSurfaceView.

    II. How do you check whether your phone supports GLES 2.0? Just use the snippet below:

    Almost every phone supports GLES 2.0 these days, so there is not much to worry about.

    public static boolean detectOpenGLES20(Context context) {  
        ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);  
        ConfigurationInfo info = am.getDeviceConfigurationInfo();  
        return (info.reqGlEsVersion >= 0x20000);  
    }  
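
    In practice you would run this check before wiring up the GL view. A minimal usage sketch from an Activity's onCreate() (the layout ID and the fallback branch are assumptions, not part of the original code):

    if (detectOpenGLES20(this)) {
        // safe to take the GLES 2.0 path described below
        setContentView(R.layout.activity_player); // hypothetical layout containing the GLSurfaceView
    } else {
        // hypothetical fallback: warn the user instead of setting up the GL pipeline
        Toast.makeText(this, "OpenGL ES 2.0 is not supported on this device", Toast.LENGTH_LONG).show();
    }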

    III. Let's get started

    A. First you need a GLSurfaceView; just drop one into your layout.

    Then look it up, give it some basic configuration, and set a Renderer on it.

    The Renderer's job is to draw the image onto the GLSurfaceView.

    mGLSurface = (GLFrameSurface) findViewById(R.id.glsurface);  
    mGLSurface.setEGLContextClientVersion(2);  
    mGLFRenderer = new GLFrameRenderer(this, mGLSurface);  
    mGLSurface.setRenderer(mGLFRenderer);  
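
    One detail worth noting: the Renderer below drives redraws itself by calling requestRender(), so the surface is typically switched to on-demand rendering right after the renderer is set. A small sketch using the same field name as above:

    // draw only when requestRender() is called, instead of redrawing continuously
    mGLSurface.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);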

    B. Next, let's look at how GLFrameRenderer is written

    import java.nio.ByteBuffer;

    import javax.microedition.khronos.egl.EGLConfig;
    import javax.microedition.khronos.opengles.GL10;

    import android.opengl.GLES20;
    import android.opengl.GLSurfaceView;
    import android.opengl.GLSurfaceView.Renderer;

    public class GLFrameRenderer implements Renderer {

        private ISimplePlayer mParentAct; // callback into the hosting player/activity, not essential here
        private GLSurfaceView mTargetSurface;
        private GLProgram prog = new GLProgram(0);
        private int mVideoWidth = -1, mVideoHeight = -1;
        private ByteBuffer y;
        private ByteBuffer u;
        private ByteBuffer v;

        public GLFrameRenderer(ISimplePlayer callback, GLSurfaceView surface) {
            mParentAct = callback; // callback into the hosting player/activity, not essential here
            mTargetSurface = surface;
        }

        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            Utils.LOGD("GLFrameRenderer :: onSurfaceCreated");
            if (!prog.isProgramBuilt()) {
                prog.buildProgram();
                Utils.LOGD("GLFrameRenderer :: buildProgram done");
            }
        }

        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
            Utils.LOGD("GLFrameRenderer :: onSurfaceChanged");
            GLES20.glViewport(0, 0, width, height);
        }

        @Override
        public void onDrawFrame(GL10 gl) {
            synchronized (this) {
                if (y != null) {
                    // reset position, has to be done before the buffers are read
                    y.position(0);
                    u.position(0);
                    v.position(0);
                    prog.buildTextures(y, u, v, mVideoWidth, mVideoHeight);
                    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
                    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
                    prog.drawFrame();
                }
            }
        }

        /**
         * this method will be called from native code, it happens when the video is about to play or
         * the video size changes.
         */
        public void update(int w, int h) {
            Utils.LOGD("INIT E");
            if (w > 0 && h > 0) {
                if (w != mVideoWidth || h != mVideoHeight) {
                    this.mVideoWidth = w;
                    this.mVideoHeight = h;
                    int yarraySize = w * h;
                    int uvarraySize = yarraySize / 4;
                    synchronized (this) {
                        y = ByteBuffer.allocate(yarraySize);
                        u = ByteBuffer.allocate(uvarraySize);
                        v = ByteBuffer.allocate(uvarraySize);
                    }
                }
            }

            mParentAct.onPlayStart(); // notify the hosting player/activity, not essential here
            Utils.LOGD("INIT X");
        }

        /**
         * this method will be called from native code, it's used for passing yuv data to me.
         */
        public void update(byte[] ydata, byte[] udata, byte[] vdata) {
            synchronized (this) {
                y.clear();
                u.clear();
                v.clear();
                y.put(ydata, 0, ydata.length);
                u.put(udata, 0, udata.length);
                v.put(vdata, 0, vdata.length);
            }

            // request to render
            mTargetSurface.requestRender();
        }
    }

    The code is straightforward; the Renderer mainly takes care of a few things:

    1. When the surface is created, I initialize the Program/Shader objects that are needed, because they are about to be used;

    2. When the surface changes, the viewport is reset;

    3. In onDrawFrame(), the data actually gets "drawn" onto the surface;

    4. The two update() methods are how the frame size and the Y/U/V data are passed in from outside (see the feeding sketch below).
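
    To make the data flow concrete, here is a hedged sketch of how a decoder callback might feed those two update() methods. It assumes the decoder hands over one packed planar I420 (YUV420p) frame per call; the callback name, the frame layout and the cached width/height fields are assumptions, not part of the original code:

    private int mLastW = -1, mLastH = -1;

    // hypothetical decoder callback: one packed I420 frame (Y plane, then U, then V) per call
    public void onFrameDecoded(byte[] i420Frame, int width, int height) {
        if (width != mLastW || height != mLastH) {
            mGLFRenderer.update(width, height); // let the renderer (re)allocate its Y/U/V buffers
            mLastW = width;
            mLastH = height;
        }
        int ySize = width * height;
        int uvSize = ySize / 4;
        byte[] y = new byte[ySize];
        byte[] u = new byte[uvSize];
        byte[] v = new byte[uvSize];
        System.arraycopy(i420Frame, 0, y, 0, ySize);
        System.arraycopy(i420Frame, ySize, u, 0, uvSize);
        System.arraycopy(i420Frame, ySize + uvSize, v, 0, uvSize);
        mGLFRenderer.update(y, u, v); // hand the planes over; this ends in requestRender()
    }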

    C. Now look at how GLProgram is written. It provides the Renderer with the computation; all the processing of the data happens here.

    public boolean isProgramBuilt() {
        return isProgBuilt;
    }

    public void buildProgram() {
        createBuffers(squareVertices, coordVertices);
        if (_program <= 0) {
            _program = createProgram(VERTEX_SHADER, FRAGMENT_SHADER);
        }
        Utils.LOGD("_program = " + _program);

        /*
         * get handle for "vPosition" and "a_texCoord"
         */
        _positionHandle = GLES20.glGetAttribLocation(_program, "vPosition");
        Utils.LOGD("_positionHandle = " + _positionHandle);
        checkGlError("glGetAttribLocation vPosition");
        if (_positionHandle == -1) {
            throw new RuntimeException("Could not get attribute location for vPosition");
        }
        _coordHandle = GLES20.glGetAttribLocation(_program, "a_texCoord");
        Utils.LOGD("_coordHandle = " + _coordHandle);
        checkGlError("glGetAttribLocation a_texCoord");
        if (_coordHandle == -1) {
            throw new RuntimeException("Could not get attribute location for a_texCoord");
        }

        /*
         * get uniform location for y/u/v, we pass data through these uniforms
         */
        _yhandle = GLES20.glGetUniformLocation(_program, "tex_y");
        Utils.LOGD("_yhandle = " + _yhandle);
        checkGlError("glGetUniformLocation tex_y");
        if (_yhandle == -1) {
            throw new RuntimeException("Could not get uniform location for tex_y");
        }
        _uhandle = GLES20.glGetUniformLocation(_program, "tex_u");
        Utils.LOGD("_uhandle = " + _uhandle);
        checkGlError("glGetUniformLocation tex_u");
        if (_uhandle == -1) {
            throw new RuntimeException("Could not get uniform location for tex_u");
        }
        _vhandle = GLES20.glGetUniformLocation(_program, "tex_v");
        Utils.LOGD("_vhandle = " + _vhandle);
        checkGlError("glGetUniformLocation tex_v");
        if (_vhandle == -1) {
            throw new RuntimeException("Could not get uniform location for tex_v");
        }

        isProgBuilt = true;
    }

    /**
     * build a set of textures, one for Y, one for U, and one for V.
     */
    public void buildTextures(Buffer y, Buffer u, Buffer v, int width, int height) {
        boolean videoSizeChanged = (width != _video_width || height != _video_height);
        if (videoSizeChanged) {
            _video_width = width;
            _video_height = height;
            Utils.LOGD("buildTextures videoSizeChanged: w=" + _video_width + " h=" + _video_height);
        }

        // building texture for Y data
        if (_ytid < 0 || videoSizeChanged) {
            if (_ytid >= 0) {
                Utils.LOGD("glDeleteTextures Y");
                GLES20.glDeleteTextures(1, new int[] { _ytid }, 0);
                checkGlError("glDeleteTextures");
            }
            // GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
            checkGlError("glGenTextures");
            _ytid = textures[0];
            Utils.LOGD("glGenTextures Y = " + _ytid);
        }
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _ytid);
        checkGlError("glBindTexture");
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, _video_width, _video_height, 0,
                GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, y);
        checkGlError("glTexImage2D");
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        // building texture for U data
        if (_utid < 0 || videoSizeChanged) {
            if (_utid >= 0) {
                Utils.LOGD("glDeleteTextures U");
                GLES20.glDeleteTextures(1, new int[] { _utid }, 0);
                checkGlError("glDeleteTextures");
            }
            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
            checkGlError("glGenTextures");
            _utid = textures[0];
            Utils.LOGD("glGenTextures U = " + _utid);
        }
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _utid);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, _video_width / 2, _video_height / 2, 0,
                GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, u);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        // building texture for V data
        if (_vtid < 0 || videoSizeChanged) {
            if (_vtid >= 0) {
                Utils.LOGD("glDeleteTextures V");
                GLES20.glDeleteTextures(1, new int[] { _vtid }, 0);
                checkGlError("glDeleteTextures");
            }
            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
            checkGlError("glGenTextures");
            _vtid = textures[0];
            Utils.LOGD("glGenTextures V = " + _vtid);
        }
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _vtid);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, _video_width / 2, _video_height / 2, 0,
                GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, v);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    }

    /**
     * render the frame
     * the YUV data will be converted to RGB by shader.
     */
    public void drawFrame() {
        GLES20.glUseProgram(_program);
        checkGlError("glUseProgram");

        GLES20.glVertexAttribPointer(_positionHandle, 2, GLES20.GL_FLOAT, false, 8, _vertice_buffer);
        checkGlError("glVertexAttribPointer mPositionHandle");
        GLES20.glEnableVertexAttribArray(_positionHandle);

        GLES20.glVertexAttribPointer(_coordHandle, 2, GLES20.GL_FLOAT, false, 8, _coord_buffer);
        checkGlError("glVertexAttribPointer maTextureHandle");
        GLES20.glEnableVertexAttribArray(_coordHandle);

        // bind textures
        GLES20.glActiveTexture(_textureI);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _ytid);
        GLES20.glUniform1i(_yhandle, _tIindex);

        GLES20.glActiveTexture(_textureII);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _utid);
        GLES20.glUniform1i(_uhandle, _tIIindex);

        GLES20.glActiveTexture(_textureIII);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _vtid);
        GLES20.glUniform1i(_vhandle, _tIIIindex);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glFinish();

        GLES20.glDisableVertexAttribArray(_positionHandle);
        GLES20.glDisableVertexAttribArray(_coordHandle);
    }

    /**
     * create program and load shaders, fragment shader is very important.
     */
    public int createProgram(String vertexSource, String fragmentSource) {
        // create shaders
        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);
        int pixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
        // just check
        Utils.LOGD("vertexShader = " + vertexShader);
        Utils.LOGD("pixelShader = " + pixelShader);

        int program = GLES20.glCreateProgram();
        if (program != 0) {
            GLES20.glAttachShader(program, vertexShader);
            checkGlError("glAttachShader");
            GLES20.glAttachShader(program, pixelShader);
            checkGlError("glAttachShader");
            GLES20.glLinkProgram(program);
            int[] linkStatus = new int[1];
            GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
            if (linkStatus[0] != GLES20.GL_TRUE) {
                Utils.LOGE("Could not link program: ", null);
                Utils.LOGE(GLES20.glGetProgramInfoLog(program), null);
                GLES20.glDeleteProgram(program);
                program = 0;
            }
        }
        return program;
    }

    /**
     * create shader with given source.
     */
    private int loadShader(int shaderType, String source) {
        int shader = GLES20.glCreateShader(shaderType);
        if (shader != 0) {
            GLES20.glShaderSource(shader, source);
            GLES20.glCompileShader(shader);
            int[] compiled = new int[1];
            GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
            if (compiled[0] == 0) {
                Utils.LOGE("Could not compile shader " + shaderType + ":", null);
                Utils.LOGE(GLES20.glGetShaderInfoLog(shader), null);
                GLES20.glDeleteShader(shader);
                shader = 0;
            }
        }
        return shader;
    }

    /**
     * these two buffers are used for holding vertices, screen vertices and texture vertices.
     */
    private void createBuffers(float[] vert, float[] coord) {
        _vertice_buffer = ByteBuffer.allocateDirect(vert.length * 4);
        _vertice_buffer.order(ByteOrder.nativeOrder());
        _vertice_buffer.asFloatBuffer().put(vert);
        _vertice_buffer.position(0);

        if (_coord_buffer == null) {
            _coord_buffer = ByteBuffer.allocateDirect(coord.length * 4);
            _coord_buffer.order(ByteOrder.nativeOrder());
            _coord_buffer.asFloatBuffer().put(coord);
            _coord_buffer.position(0);
        }
    }

    private void checkGlError(String op) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Utils.LOGE("***** " + op + ": glError " + error, null);
            throw new RuntimeException(op + ": glError " + error);
        }
    }

    private static float[] squareVertices = { -1.0f, -1.0f, 1.0f, -1.0f, -1.0f, 1.0f, 1.0f, 1.0f, }; // fullscreen

    private static float[] coordVertices = { 0.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 0.0f, }; // whole-texture

    private static final String VERTEX_SHADER = "attribute vec4 vPosition;\n" + "attribute vec2 a_texCoord;\n"
            + "varying vec2 tc;\n" + "void main() {\n" + "gl_Position = vPosition;\n" + "tc = a_texCoord;\n" + "}\n";

    private static final String FRAGMENT_SHADER = "precision mediump float;\n" + "uniform sampler2D tex_y;\n"
            + "uniform sampler2D tex_u;\n" + "uniform sampler2D tex_v;\n" + "varying vec2 tc;\n" + "void main() {\n"
            + "vec4 c = vec4((texture2D(tex_y, tc).r - 16./255.) * 1.164);\n"
            + "vec4 U = vec4(texture2D(tex_u, tc).r - 128./255.);\n"
            + "vec4 V = vec4(texture2D(tex_v, tc).r - 128./255.);\n" + "c += V * vec4(1.596, -0.813, 0, 0);\n"
            + "c += U * vec4(0, -0.392, 2.017, 0);\n" + "c.a = 1.0;\n" + "gl_FragColor = c;\n" + "}\n";

    This code is a bit more involved, so here is a short explanation:

    1. First, buildProgram() creates a program whose job is the YUV->RGB conversion. It uses two shaders (a shader is like a small compute unit that runs a piece of code): the first runs the code in VERTEX_SHADER and passes the coordinates through to the second; the second does the actual YUV->RGB math (see the numeric reference after this list).

    2. buildTextures() creates three textures, one each for the Y, U, and V planes; sampled together by the fragment shader, they produce the color image.

    3. drawFrame() runs the program and actually performs the draw.
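
    For reference, the constants in FRAGMENT_SHADER are the BT.601 "video range" YUV-to-RGB conversion. Below is a CPU-side sketch of the same per-pixel math, useful only for checking values; in this solution the conversion itself always happens in the shader:

    // same math as the fragment shader, written out per pixel (all values in 0..255)
    static int[] yuvToRgb(int y, int u, int v) {
        float yScaled = 1.164f * (y - 16);
        float r = yScaled + 1.596f * (v - 128);
        float g = yScaled - 0.813f * (v - 128) - 0.392f * (u - 128);
        float b = yScaled + 2.017f * (u - 128);
        return new int[] {
                Math.max(0, Math.min(255, Math.round(r))),
                Math.max(0, Math.min(255, Math.round(g))),
                Math.max(0, Math.min(255, Math.round(b))) };
    }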

    With this in place you can display YUV images, or video, on Android, and it is not slow at all. I hope it helps.

    Download link for the related code:

    http://download.csdn.net/detail/ueryueryuery/7144851

    This article originally appeared at: http://blog.csdn.net/ueryueryuery/article/details/17608185#comments
