
[Repost] Mastering Android Camera Development (3): First in China --- Previewing the Camera with GLSurfaceView, a basic photo-capture demo...


GLSurfaceView is a class from Android's OpenGL support. It can also be used to preview the Camera, and it has its own strengths there. What strengths? When SurfaceView simply cannot do what you need, GLSurfaceView is the way out: it genuinely separates the Camera's data from its display. Once you understand it, tricks like running the Camera preview without showing anything on screen become trivial. The stock Camera source in Android 4.0 previews with SurfaceView, Android 4.2 switched to GLSurfaceView, and Android 4.4 now uses the platform's own TextureView, which hints at why TextureView was added.

Although the Android 4.2 Camera source previews with GLSurfaceView, it is wrapped in layer upon layer of abstraction, and as an OpenGL beginner I could not make much sense of it. My goal was modest: a demo that can take pictures and makes the flow of previewing the Camera with GLSurfaceView clear. A round of searching on Baidu turned up nothing, and a long trawl through Google found nothing usable either. What I did find was many people using GLSurfaceView and SurfaceView together, with the SurfaceView showing the preview data and a GLSurfaceView layered on top to draw extra information. Left to figure it out myself, I got something that could take pictures and receive data, but the screen stayed either solid white or solid black. Eventually I found a usable answer on Stack Overflow; with that as a reference and another day of tweaking, it finally all worked. Most of the time went into learning the basic OpenGL ES 2.0 drawing flow, which differs a bit from plain OpenGL drawing. The source code follows.

I. CameraGLSurfaceView.java: this class extends GLSurfaceView and implements two interfaces

package org.yanzi.camera.preview;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import org.yanzi.camera.CameraInterface;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.GLSurfaceView.Renderer;
import android.util.AttributeSet;
import android.util.Log;

public class CameraGLSurfaceView extends GLSurfaceView implements Renderer, SurfaceTexture.OnFrameAvailableListener {
    private static final String TAG = "yanzi";
    Context mContext;
    SurfaceTexture mSurface;
    int mTextureID = -1;
    DirectDrawer mDirectDrawer;

    public CameraGLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        setEGLContextClientVersion(2);
        setRenderer(this);
        setRenderMode(RENDERMODE_WHEN_DIRTY);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated...");
        mTextureID = createTextureID();
        mSurface = new SurfaceTexture(mTextureID);
        mSurface.setOnFrameAvailableListener(this);
        mDirectDrawer = new DirectDrawer(mTextureID);
        CameraInterface.getInstance().doOpenCamera(null);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.i(TAG, "onSurfaceChanged...");
        GLES20.glViewport(0, 0, width, height);
        if (!CameraInterface.getInstance().isPreviewing()) {
            CameraInterface.getInstance().doStartPreview(mSurface, 1.33f);
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.i(TAG, "onDrawFrame...");
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        mSurface.updateTexImage();
        float[] mtx = new float[16];
        mSurface.getTransformMatrix(mtx);
        mDirectDrawer.draw(mtx);
    }

    @Override
    public void onPause() {
        super.onPause();
        CameraInterface.getInstance().doStopCamera();
    }

    private int createTextureID() {
        int[] texture = new int[1];
        GLES20.glGenTextures(1, texture, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        return texture[0];
    }

    public SurfaceTexture _getSurfaceTexture() {
        return mSurface;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        Log.i(TAG, "onFrameAvailable...");
        this.requestRender();
    }
}

A few brief notes on this class:

1. The Renderer interface has three callbacks: onSurfaceCreated(), onSurfaceChanged() and onDrawFrame(). The constructor sets the OpenGL ES version with setEGLContextClientVersion(2); without this call nothing gets drawn at all, because Android supports OpenGL ES 1.1, 2.0 and the newer 3.0, the versions differ a lot, and GLSurfaceView has to be told which API version to render with. After setRenderer(this), the render mode is set to RENDERMODE_WHEN_DIRTY. This is also critical; see the API documentation quoted below (the three constructor calls are repeated after the quote for reference):

When renderMode is RENDERMODE_CONTINUOUSLY, the renderer is called repeatedly to re-render the scene. When renderMode is RENDERMODE_WHEN_DIRTY, the renderer only renders when the surface is created, or when requestRender() is called. Defaults to RENDERMODE_CONTINUOUSLY.

Using RENDERMODE_WHEN_DIRTY can improve battery life and overall system performance by allowing the GPU and CPU to idle when the view does not need to be updated.

Roughly: with RENDERMODE_CONTINUOUSLY the renderer is called all the time, while with RENDERMODE_WHEN_DIRTY it only renders when the surface is created or when requestRender() is called on the GLSurfaceView. The default is the continuous mode. The Camera obviously suits the dirty mode: at 30 frames per second, render only when a frame actually arrives.
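The three setup calls from the constructor above, pulled out for reference (note that setRenderMode() can only be called after setRenderer()):

setEGLContextClientVersion(2);          // request an OpenGL ES 2.0 rendering context
setRenderer(this);                      // this view acts as its own Renderer
setRenderMode(RENDERMODE_WHEN_DIRTY);   // only render when requestRender() is called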

2. Precisely because the mode is RENDERMODE_WHEN_DIRTY, something has to tell the GLSurfaceView when to render, that is, when to enter onDrawFrame(). This is exactly what SurfaceTexture.OnFrameAvailableListener does: when a new frame comes in, execution enters

public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.i(TAG, "onFrameAvailable...");
    this.requestRender();
}

and requestRender() is executed there, which triggers the next onDrawFrame() pass.

3. Some OpenGL ES samples on the web implement SurfaceTexture.OnFrameAvailableListener in the Activity instead. That really does not matter; whichever class implements it, what counts is what the callback does (see the sketch below).
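For illustration, the Activity-based variant might look roughly like this (a sketch, not part of this demo; it assumes the Activity registers itself on the SurfaceTexture exposed by _getSurfaceTexture() once the GL surface exists):

import android.app.Activity;
import android.graphics.SurfaceTexture;

public class CameraActivity extends Activity implements SurfaceTexture.OnFrameAvailableListener {

    private CameraGLSurfaceView glSurfaceView;

    // After the GL surface has been created, register the Activity as the listener:
    //   glSurfaceView._getSurfaceTexture().setOnFrameAvailableListener(this);

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        // Same job as before: a new camera frame is ready, so request a render pass.
        glSurfaceView.requestRender();
    }
}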

4. Compared with the TextureView approach: with TextureView you only implement SurfaceTextureListener and the SurfaceTexture is created for you automatically, whereas with GLSurfaceView you have to create the SurfaceTexture yourself and bind it to a texture ID (for contrast, see the sketch below).
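For contrast, a minimal sketch of the TextureView path, reusing the CameraInterface helper from this demo as an assumption: the listener hands you a SurfaceTexture that TextureView has already created and attached internally, so there is no createTextureID() step.

textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        // The SurfaceTexture arrives ready-made; just open the camera and start previewing into it.
        CameraInterface.getInstance().doOpenCamera(null);
        CameraInterface.getInstance().doStartPreview(surface, 1.33f);
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        CameraInterface.getInstance().doStopCamera();
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) { }
});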

5. This demo opens the Camera in onSurfaceCreated() and starts the preview in onSurfaceChanged(), with a default aspect ratio of 1.33. The reason is that, compared with the previous two preview methods, creating the SurfaceTexture here takes some time. If you want the Activity to drive the start of the preview instead, the GLSurfaceView has to hand the created SurfaceTexture over to the Activity, for example with a Handler (see the sketch below).
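One possible way to do that hand-off (not part of this demo; the Handler field, message code and setter below are invented for illustration):

// Hypothetical additions inside CameraGLSurfaceView:
public static final int MSG_SURFACE_READY = 1;  // invented message code
private Handler mUiHandler;                     // supplied by the Activity, bound to the main looper

public void setUiHandler(Handler handler) {
    mUiHandler = handler;
}

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    mTextureID = createTextureID();
    mSurface = new SurfaceTexture(mTextureID);
    mSurface.setOnFrameAvailableListener(this);
    mDirectDrawer = new DirectDrawer(mTextureID);
    if (mUiHandler != null) {
        // Hand the ready SurfaceTexture to the Activity, which can then decide when to call
        // CameraInterface.getInstance().doStartPreview(texture, 1.33f).
        mUiHandler.obtainMessage(MSG_SURFACE_READY, mSurface).sendToTarget();
    }
}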

II. DirectDrawer.java: this class is crucial; it is responsible for drawing the SurfaceTexture's content onto the screen

package org.yanzi.camera.preview;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;

public class DirectDrawer {
    private final String vertexShaderCode =
            "attribute vec4 vPosition;" +
            "attribute vec2 inputTextureCoordinate;" +
            "varying vec2 textureCoordinate;" +
            "void main()" +
            "{" +
            "  gl_Position = vPosition;" +
            "  textureCoordinate = inputTextureCoordinate;" +
            "}";

    private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;" +
            "varying vec2 textureCoordinate;\n" +
            "uniform samplerExternalOES s_texture;\n" +
            "void main() {" +
            "  gl_FragColor = texture2D(s_texture, textureCoordinate);\n" +
            "}";

    private FloatBuffer vertexBuffer, textureVerticesBuffer;
    private ShortBuffer drawListBuffer;
    private final int mProgram;
    private int mPositionHandle;
    private int mTextureCoordHandle;

    private short drawOrder[] = {0, 1, 2, 0, 2, 3}; // order to draw vertices

    // number of coordinates per vertex in this array
    private static final int COORDS_PER_VERTEX = 2;

    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

    static float squareCoords[] = {
            -1.0f,  1.0f,
            -1.0f, -1.0f,
             1.0f, -1.0f,
             1.0f,  1.0f,
    };

    static float textureVertices[] = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
            0.0f, 0.0f,
    };

    private int texture;

    public DirectDrawer(int texture) {
        this.texture = texture;

        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);

        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);

        ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
        bb2.order(ByteOrder.nativeOrder());
        textureVerticesBuffer = bb2.asFloatBuffer();
        textureVerticesBuffer.put(textureVertices);
        textureVerticesBuffer.position(0);

        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

        mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // creates OpenGL ES program executables
    }

    public void draw(float[] mtx) {
        GLES20.glUseProgram(mProgram);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);

        // get handle to vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");

        // Enable a handle to the triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);

        // Prepare the coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

        mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
        GLES20.glEnableVertexAttribArray(mTextureCoordHandle);

        // textureVerticesBuffer.clear();
        // textureVerticesBuffer.put(transformTextureCoordinates(textureVertices, mtx));
        // textureVerticesBuffer.position(0);
        GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

        // Disable vertex arrays
        GLES20.glDisableVertexAttribArray(mPositionHandle);
        GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
    }

    private int loadShader(int type, String shaderCode) {
        // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
        // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
        int shader = GLES20.glCreateShader(type);

        // add the source code to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        return shader;
    }

    private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
        float[] result = new float[coords.length];
        float[] vt = new float[4];
        for (int i = 0; i < coords.length; i += 2) {
            float[] v = {coords[i], coords[i + 1], 0, 1};
            Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
            result[i] = vt[0];
            result[i + 1] = vt[1];
        }
        return result;
    }
}

III. With the two classes above, 95% of the work is done. You can think of the GLSurfaceView as having a lifecycle of its own: the Camera is closed in its onPause(), and two methods are overridden in the Activity:

@Override
protected void onResume() {
    super.onResume();
    glSurfaceView.bringToFront();
}

@Override
protected void onPause() {
    super.onPause();
    glSurfaceView.onPause();
}

The glSurfaceView.bringToFront() call actually works fine even if you leave it out. Then just declare the custom GLSurfaceView in the layout and you are done:

<FrameLayout
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" >

    <org.yanzi.camera.preview.CameraGLSurfaceView
        android:id="@+id/camera_textureview"
        android:layout_width="0dip"
        android:layout_height="0dip" />

</FrameLayout>

CameraActivity is only responsible for the UI; CameraGLSurfaceView opens the Camera, starts the preview, and calls DirectDrawer's draw() to do the rendering. The rest of the code is not shown here (a rough sketch of the preview-start call follows below).
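The CameraInterface helper itself is not shown in this post. As a rough idea of what its doStartPreview() has to do for GLSurfaceView preview, here is a minimal sketch: my own guess, not the author's actual code. It assumes the old android.hardware.Camera API used throughout this series, with mCamera and isPreviewing as fields of the singleton, and it leaves out the preview-size selection that previewRate (1.33f in the demo) would drive.

public void doStartPreview(SurfaceTexture surface, float previewRate) {
    if (mCamera == null || isPreviewing) {
        return;
    }
    try {
        // The key difference from SurfaceView preview: frames are routed into the external
        // OES texture behind this SurfaceTexture instead of into a SurfaceHolder.
        mCamera.setPreviewTexture(surface);
    } catch (IOException e) {
        e.printStackTrace();
    }
    mCamera.startPreview();
    isPreviewing = true;
}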

Notes:

1. In onDrawFrame(), if you do not call mDirectDrawer.draw(mtx), nothing shows up at all! That is what makes GLSurfaceView special: SurfaceView and TextureView display the preview for you, but with GLSurfaceView the frames only land in an OpenGL texture, so you have to draw that texture yourself by going through the OpenGL ES pipeline.

2. Exactly where draw(mtx) gets its buffer from I have not fully worked out yet; it does not appear to request one explicitly. Instead, the texture ID generated before the SurfaceTexture is created in CameraGLSurfaceView is the link: it is bound to the SurfaceTexture on one side and handed to DirectDrawer on the other, with the SurfaceTexture acting as the render target that the camera frames go into.

3. In the referenced link, someone posted the following three pieces of code to solve a problem:

@Override
public void onDrawFrame(GL10 gl) {
    float[] mtx = new float[16];
    mSurface.updateTexImage();
    mSurface.getTransformMatrix(mtx);
    mDirectVideo.draw(mtx);
}

private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
    float[] result = new float[coords.length];
    float[] vt = new float[4];
    for (int i = 0; i < coords.length; i += 2) {
        float[] v = {coords[i], coords[i + 1], 0, 1};
        Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
        result[i] = vt[0];
        result[i + 1] = vt[1];
    }
    return result;
}

textureVerticesBuffer.clear();
textureVerticesBuffer.put(transformTextureCoordinates(textureVertices, mtx));
textureVerticesBuffer.position(0);

I have merged all of that code into this demo, but it is not used inside draw(). The reason is that with it the preview actually comes out distorted, while without it everything looks fine. The code above obtains the SurfaceTexture's transform matrix via mSurface.getTransformMatrix(),

then passes that matrix to draw(), where textureVerticesBuffer is transformed before drawing.

The figure below shows the result without the matrix transform:

The next figure shows the result with the transform matrix applied. I cannot quite describe how the image is warped, but it does show how powerful OpenGL ES is at the rendering stage: just by setting a matrix, without touching the frames one by one, you get a completely different display effect.

----------------------------- This article is original; when reposting, please credit the author yanzi1225627

Version: PlayCamera_V3.0.0[-6-22].zip

CSDN download link: /detail/yanzi1225627/7547263

Baidu Cloud:

Attached, a concise OpenGL ES tutorial: /android-20427-1-1.html
