Android Camera Development: Previewing the Camera with GLSurfaceView and Basic Photo Capture


GLSurfaceView is an OpenGL-related class in Android, and it can also be used to preview the Camera — with a unique strength of its own: it truly separates the camera data from its display. When SurfaceView leaves you stuck and frustrated, GLSurfaceView is the tool to reach for; once you understand it, tricks like running the camera preview without displaying anything become trivial. The stock Camera app in Android 4.0 previews with SurfaceView, Android 4.2 switched to GLSurfaceView, and Android 4.4 moved to Android's own TextureView — which hints at why TextureView was introduced.

Although the Android 4.2 Camera source previews with GLSurfaceView, it is wrapped in layer upon layer of abstraction; as an OpenGL beginner I found it hard to follow. My goal was modest: a working photo-capture demo that shows the basic flow of using GLSurfaceView to preview the Camera. Searching Baidu turned up nothing, and a long detour through Google wasn't much better. Many people use GLSurfaceView and SurfaceView together — SurfaceView displays the preview, with a GLSurfaceView layered on top to draw overlay information. My own first attempts could take pictures and receive frame data, but the screen stayed either all white or all black. Finally a working link on Stack Overflow showed the way — daylight at last! Starting from it, another day of tweaking got everything working. Most of that time went into learning the basic OpenGL ES 2.0 drawing flow, which differs somewhat from classic OpenGL. The source follows:

1. CameraGLSurfaceView.java — extends GLSurfaceView and implements two interfaces

 

package org.yanzi.camera.preview;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import org.yanzi.camera.CameraInterface;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.GLSurfaceView.Renderer;
import android.util.AttributeSet;
import android.util.Log;

public class CameraGLSurfaceView extends GLSurfaceView implements Renderer, SurfaceTexture.OnFrameAvailableListener {
    private static final String TAG = "yanzi";
    Context mContext;
    SurfaceTexture mSurface;
    int mTextureID = -1;
    DirectDrawer mDirectDrawer;

    public CameraGLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        setEGLContextClientVersion(2);
        setRenderer(this);
        setRenderMode(RENDERMODE_WHEN_DIRTY);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated...");
        mTextureID = createTextureID();
        mSurface = new SurfaceTexture(mTextureID);
        mSurface.setOnFrameAvailableListener(this);
        mDirectDrawer = new DirectDrawer(mTextureID);
        CameraInterface.getInstance().doOpenCamera(null);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.i(TAG, "onSurfaceChanged...");
        GLES20.glViewport(0, 0, width, height);
        if (!CameraInterface.getInstance().isPreviewing()) {
            CameraInterface.getInstance().doStartPreview(mSurface, 1.33f);
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.i(TAG, "onDrawFrame...");
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        mSurface.updateTexImage(); // latch the latest camera frame into the texture
        float[] mtx = new float[16];
        mSurface.getTransformMatrix(mtx);
        mDirectDrawer.draw(mtx);
    }

    @Override
    public void onPause() {
        super.onPause();
        CameraInterface.getInstance().doStopCamera();
    }

    private int createTextureID() {
        int[] texture = new int[1];

        GLES20.glGenTextures(1, texture, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);

        return texture[0];
    }

    public SurfaceTexture _getSurfaceTexture() {
        return mSurface;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        Log.i(TAG, "onFrameAvailable...");
        this.requestRender();
    }
}

 

A few notes on this class:

 

1. The Renderer interface has three callbacks: onSurfaceCreated(), onSurfaceChanged(), and onDrawFrame(). In the constructor we declare the GL context version with setEGLContextClientVersion(2); without this setting nothing gets drawn at all, because Android supports OpenGL ES 1.1, 2.0, and now 3.0, and the versions differ substantially — the view has to be told which API version to render with. After setRenderer(this), the render mode is set to RENDERMODE_WHEN_DIRTY, which is also critical. From the API docs:

 

When renderMode is RENDERMODE_CONTINUOUSLY, the renderer is called repeatedly to re-render the scene. When renderMode is RENDERMODE_WHEN_DIRTY, the renderer only renders when the surface is created, or when requestRender is called. Defaults to RENDERMODE_CONTINUOUSLY.

Using RENDERMODE_WHEN_DIRTY can improve battery life and overall system performance by allowing the GPU and CPU to idle when the view does not need to be updated. 

In short: RENDERMODE_CONTINUOUSLY renders continuously, while RENDERMODE_WHEN_DIRTY renders only when the surface is created or when requestRender() is called explicitly. The default is continuous mode, but the dirty mode clearly suits a camera preview: at roughly 30 frames per second, render only when a new frame arrives.

2. Precisely because we use RENDERMODE_WHEN_DIRTY, something must tell the GLSurfaceView when to render — that is, when to enter onDrawFrame(). This is exactly what SurfaceTexture.OnFrameAvailableListener does. When a new frame arrives, execution enters:

public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.i(TAG, "onFrameAvailable...");
    this.requestRender();
}

 

and requestRender() schedules the next draw.

3. Some OpenGL ES samples online implement SurfaceTexture.OnFrameAvailableListener in the Activity instead. It doesn't really matter who implements it — what matters is what the callback does.

4. Compared with TextureView: a TextureView creates its SurfaceTexture automatically because it implements SurfaceTextureListener, whereas with GLSurfaceView you must create the SurfaceTexture yourself and bind it to a texture ID.

5. This demo opens the Camera in onSurfaceCreated() and starts the preview in onSurfaceChanged() with a default 1.33 aspect ratio. The reason is that, unlike the two previous preview approaches, creating the SurfaceTexture here takes some time. If you want the Activity to initiate the preview instead, the GLSurfaceView must hand the created SurfaceTexture back to the Activity, for example via a Handler.
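If the preview were initiated from the Activity as described above, the handoff might be sketched roughly like this (a sketch only — `mActivityHandler`, `MSG_SURFACE_READY`, and the wiring are assumed names for illustration, not part of the original project):

```java
// Hypothetical variant of onSurfaceCreated(): instead of opening the
// camera here, notify the Activity that the SurfaceTexture is ready.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    mTextureID = createTextureID();
    mSurface = new SurfaceTexture(mTextureID);
    mSurface.setOnFrameAvailableListener(this);
    mDirectDrawer = new DirectDrawer(mTextureID);
    // Send the SurfaceTexture to the Activity's Handler (assumed field).
    mActivityHandler.obtainMessage(MSG_SURFACE_READY, mSurface).sendToTarget();
}

// In the Activity's handleMessage(), the preview can then be started
// on the Activity's own schedule:
//   CameraInterface.getInstance().doOpenCamera(null);
//   CameraInterface.getInstance().doStartPreview((SurfaceTexture) msg.obj, 1.33f);
```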

 

2. DirectDrawer.java — the crucial class that draws the SurfaceTexture's content onto the screen

 

package org.yanzi.camera.preview;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;

public class DirectDrawer {
    private final String vertexShaderCode =
            "attribute vec4 vPosition;" +
            "attribute vec2 inputTextureCoordinate;" +
            "varying vec2 textureCoordinate;" +
            "void main()" +
            "{" +
                "gl_Position = vPosition;" +
                "textureCoordinate = inputTextureCoordinate;" +
            "}";

    private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;" +
            "varying vec2 textureCoordinate;\n" +
            "uniform samplerExternalOES s_texture;\n" +
            "void main() {" +
            "  gl_FragColor = texture2D( s_texture, textureCoordinate );\n" +
            "}";

    private FloatBuffer vertexBuffer, textureVerticesBuffer;
    private ShortBuffer drawListBuffer;
    private final int mProgram;
    private int mPositionHandle;
    private int mTextureCoordHandle;

    private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices

    // number of coordinates per vertex in this array
    private static final int COORDS_PER_VERTEX = 2;

    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per float coordinate

    static float squareCoords[] = {
       -1.0f,  1.0f,
       -1.0f, -1.0f,
        1.0f, -1.0f,
        1.0f,  1.0f,
    };

    static float textureVertices[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 0.0f,
    };

    private int texture;

    public DirectDrawer(int texture)
    {
        this.texture = texture;
        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);

        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);

        ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
        bb2.order(ByteOrder.nativeOrder());
        textureVerticesBuffer = bb2.asFloatBuffer();
        textureVerticesBuffer.put(textureVertices);
        textureVerticesBuffer.position(0);

        int vertexShader    = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader  = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

        mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // creates OpenGL ES program executables
    }

    public void draw(float[] mtx)
    {
        GLES20.glUseProgram(mProgram);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);

        // get handle to vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");

        // Enable a handle to the triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);

        // Prepare the quad coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

        mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
        GLES20.glEnableVertexAttribArray(mTextureCoordHandle);

//        textureVerticesBuffer.clear();
//        textureVerticesBuffer.put( transformTextureCoordinates( textureVertices, mtx ));
//        textureVerticesBuffer.position(0);
        GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

        // Disable vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
        GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
    }

    private int loadShader(int type, String shaderCode){

        // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
        // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
        int shader = GLES20.glCreateShader(type);

        // add the source code to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        return shader;
    }

    private float[] transformTextureCoordinates( float[] coords, float[] matrix)
    {
       float[] result = new float[ coords.length ];
       float[] vt = new float[4];

       for ( int i = 0 ; i < coords.length ; i += 2 ) {
           float[] v = { coords[i], coords[i+1], 0 , 1  };
           Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
           result[i] = vt[0];
           result[i+1] = vt[1];
       }
       return result;
    }
}

 


3. With the two classes above, 95% of the work is done. Think of the GLSurfaceView as having a lifecycle: the Camera is closed in onPause(), and the Activity overrides two methods:

    @Override
    protected void onResume() {
        super.onResume();
        glSurfaceView.bringToFront();
    }

    @Override
    protected void onPause() {
        super.onPause();
        glSurfaceView.onPause();
    }

 

The glSurfaceView.bringToFront() call is actually optional. Then just declare the custom GLSurfaceView in the layout:

    <FrameLayout
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" >
        <org.yanzi.camera.preview.CameraGLSurfaceView
            android:id="@+id/camera_textureview"
            android:layout_width="0dip"
            android:layout_height="0dip" />
    </FrameLayout>

 

CameraActivity handles only the UI; CameraGLSurfaceView opens the Camera, starts the preview, and calls DirectDrawer's draw() to render. The remaining code is omitted here.
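CameraInterface itself is not shown in this post. For completeness, here is a minimal sketch of what its methods presumably do with the classic android.hardware.Camera API — the method names come from the calls above, but the bodies and singleton plumbing are assumptions, not the original implementation:

```java
// Sketch only: error handling and parameter setup (preview size chosen
// from the 1.33 aspect ratio, etc.) are omitted or assumed.
public class CameraInterface {
    private static CameraInterface sInstance;
    private Camera mCamera;
    private boolean mIsPreviewing;

    public static synchronized CameraInterface getInstance() {
        if (sInstance == null) sInstance = new CameraInterface();
        return sInstance;
    }

    public void doOpenCamera(Object callback) {
        mCamera = Camera.open(); // opens the default (back-facing) camera
    }

    public void doStartPreview(SurfaceTexture surface, float previewRate) {
        try {
            // The key call: camera frames are delivered into the GL
            // texture behind this SurfaceTexture, not into a Surface.
            mCamera.setPreviewTexture(surface);
        } catch (IOException e) {
            e.printStackTrace();
        }
        mCamera.startPreview();
        mIsPreviewing = true;
    }

    public boolean isPreviewing() { return mIsPreviewing; }

    public void doStopCamera() {
        if (mCamera != null) {
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
            mIsPreviewing = false;
        }
    }
}
```

The setPreviewTexture() call is what makes the "data and display are separated" claim concrete: the camera keeps producing frames whether or not anything ever draws that texture to the screen.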

 

Notes:

1. In onDrawFrame(), if you don't call mDirectDrawer.draw(mtx), nothing shows up at all! This is what makes GLSurfaceView special: unlike SurfaceView and TextureView, which display the camera frames for you, GLSurfaceView only supplies the EGL context and render loop — you have to draw every frame yourself following the OpenGL ES pipeline.

2. Where mDirectDrawer.draw(mtx) gets its pixel data is not obvious at first — no buffer is explicitly requested. The link is the texture ID generated just before the SurfaceTexture was created in CameraGLSurfaceView: it is bound both to the SurfaceTexture, which the camera renders its frames into, and to DirectDrawer, which samples it as an external OES texture.
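The only buffers DirectDrawer itself owns are the direct NIO buffers built once in its constructor (vertex positions, texture coordinates, draw order). The allocateDirect / nativeOrder / position(0) pattern it uses can be checked in plain Java, off-device:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferDemo {
    // Same quad as DirectDrawer.squareCoords: 4 vertices, 2 floats each.
    static final float[] SQUARE = { -1f, 1f, -1f, -1f, 1f, -1f, 1f, 1f };

    static FloatBuffer toDirectFloatBuffer(float[] data) {
        ByteBuffer bb = ByteBuffer.allocateDirect(data.length * 4); // 4 bytes per float
        bb.order(ByteOrder.nativeOrder()); // GL expects native byte order
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(data);
        fb.position(0); // rewind so GL reads from element 0
        return fb;
    }

    public static void main(String[] args) {
        FloatBuffer fb = toDirectFloatBuffer(SQUARE);
        System.out.println(fb.isDirect());  // true: usable by native GL
        System.out.println(fb.remaining()); // 8 floats ready to read
    }
}
```

Forgetting the position(0) rewind is a classic mistake: glVertexAttribPointer would then read from the end of the buffer and draw garbage or nothing.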

3. In the referenced thread, someone solving the same problem posted the following three pieces of code:

 

@Override
public void onDrawFrame(GL10 gl)
{
    float[] mtx = new float[16];
    mSurface.updateTexImage();
    mSurface.getTransformMatrix(mtx);    

    mDirectVideo.draw(mtx);
}
 private float[] transformTextureCoordinates( float[] coords, float[] matrix)
 {          
    float[] result = new float[ coords.length ];        
    float[] vt = new float[4];      

    for ( int i = 0 ; i < coords.length ; i += 2 ) {
        float[] v = { coords[i], coords[i+1], 0 , 1  };
        Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
        result[i] = vt[0];
        result[i+1] = vt[1];
    }
    return result;
 }
textureVerticesBuffer.clear();
textureVerticesBuffer.put( transformTextureCoordinates( textureVertices, mtx ));
textureVerticesBuffer.position(0);

 

I have folded all of this code into the demo, but left it unused in draw(): with the transform applied, the preview came out distorted, while without it everything looks fine. The code above obtains the SurfaceTexture's transform matrix via mSurface.getTransformMatrix(),

 

then passes that matrix to draw(), which transforms textureVerticesBuffer before drawing.
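What getTransformMatrix() returns is an ordinary 4x4 column-major matrix, and applying it to a texture coordinate is one matrix-vector multiply. The math can be sketched in plain Java by reimplementing android.opengl.Matrix.multiplyMV so it runs off-device; the vertical-flip matrix below is a typical example of what SurfaceTexture produces, though the actual matrix varies by device:

```java
public class TexTransformDemo {
    // Column-major 4x4 matrix times (x, y, z, w), matching the semantics
    // of android.opengl.Matrix.multiplyMV.
    static float[] multiplyMV(float[] m, float[] v) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++) {
            r[i] = m[i] * v[0] + m[4 + i] * v[1] + m[8 + i] * v[2] + m[12 + i] * v[3];
        }
        return r;
    }

    // Same loop as DirectDrawer.transformTextureCoordinates.
    static float[] transformTextureCoordinates(float[] coords, float[] matrix) {
        float[] result = new float[coords.length];
        for (int i = 0; i < coords.length; i += 2) {
            float[] vt = multiplyMV(matrix, new float[] { coords[i], coords[i + 1], 0f, 1f });
            result[i] = vt[0];
            result[i + 1] = vt[1];
        }
        return result;
    }

    public static void main(String[] args) {
        // A typical SurfaceTexture matrix: flips t vertically (t' = 1 - t).
        float[] flipY = {
            1,  0, 0, 0,
            0, -1, 0, 0,
            0,  0, 1, 0,
            0,  1, 0, 1,
        };
        float[] out = transformTextureCoordinates(new float[] { 0f, 0f, 1f, 1f }, flipY);
        // (0,0) -> (0,1) and (1,1) -> (1,0)
        System.out.println(out[0] + "," + out[1] + "  " + out[2] + "," + out[3]);
    }
}
```

This also suggests why applying the matrix on top of the hand-written textureVertices array can distort the preview: those coordinates were already laid out to compensate for the camera's orientation, so applying the device's transform a second time over-corrects.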

[Screenshot: the preview without the matrix transform]

The next screenshot uses the transform matrix. The exact nature of the distortion is hard to put into words, but it shows how powerful OpenGL ES rendering is: just setting a matrix — with no per-frame pixel processing — produces a completely different display.

[Screenshot: the preview with the transform matrix applied]

 

 

 

----------------------------- This article is original work; when reposting, please credit the author, yanzi1225627.

Version: PlayCamera_V3.0.0[2014-6-22].zip

CSDN download link: http://download.csdn.net/detail/yanzi1225627/7547263

Baidu Cloud:

A concise OpenGL ES tutorial: http://www.apkbus.com/android-20427-1-1.html

