Fixing the OpenCV JavaCameraView camera preview orientation problem


    Most of the fixes I found online had problems of their own, so after digging through the source code for a while I solved it myself. I approached it from the complete camera startup and data-delivery flow; here I cover only the key points, and at the end I attach the two modified source files.

    First, there is one concept to understand.

    (The small circle in the figure is the Home button, not the camera :)

    The question now is where to perform the rotation and how, which requires understanding how the JavaCameraView class works. JavaCameraView implements the abstract method connectCamera of its parent class CameraBridgeViewBase, and this method mainly does two things:

1. Initialize the camera: choose which camera to open, select the preview frame size, and so on. This runs on the UI thread, so onPreviewFrame is also invoked on the UI thread; once the camera starts, frame data is delivered to onPreviewFrame. onPreviewFrame does not process the data itself; it just stores it and notifies another thread to do the processing.

2. Start the thread that processes the data stored by onPreviewFrame.
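The hand-off between these two steps can be sketched as a minimal producer/consumer pair. This is a simplified, hypothetical model of what JavaCameraView does with its double-buffered mFrameChain and mChainIdx; the FrameRelay class below is mine, not OpenCV's:

```java
// Minimal double-buffered hand-off: the camera-callback thread stores a frame
// and notifies; a worker thread takes the frame and swaps buffers.
// Hypothetical simplification of JavaCameraView's mFrameChain/mChainIdx scheme.
public class FrameRelay {
    private final byte[][] chain = new byte[2][];
    private int writeIdx = 0;          // buffer the callback writes next
    private boolean frameReady = false;
    private boolean stopped = false;

    // Called from the camera (producer) thread, like onPreviewFrame:
    // store the data, do not process it here.
    public synchronized void onFrame(byte[] data) {
        chain[writeIdx] = data.clone();
        frameReady = true;
        notify();                       // wake the worker
    }

    public synchronized void stop() {
        stopped = true;
        notify();
    }

    // Called from the worker (consumer) thread; returns null when stopped.
    public synchronized byte[] takeFrame() throws InterruptedException {
        while (!frameReady && !stopped) wait();
        if (stopped) return null;
        frameReady = false;
        byte[] frame = chain[writeIdx];
        writeIdx = 1 - writeIdx;        // swap buffers for the next write
        return frame;
    }

    public static void main(String[] args) throws Exception {
        FrameRelay relay = new FrameRelay();
        Thread worker = new Thread(() -> {
            try {
                byte[] f;
                while ((f = relay.takeFrame()) != null) {
                    System.out.println("processed frame of " + f.length + " bytes");
                }
            } catch (InterruptedException ignored) {}
        });
        worker.start();
        relay.onFrame(new byte[]{1, 2, 3});
        Thread.sleep(100);
        relay.stop();
        worker.join();
    }
}
```

The real code additionally reuses preallocated Mat buffers instead of copying byte arrays, but the synchronization shape is the same.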

The problem with JavaCameraView is that neither step distinguishes between the SurfaceView's coordinate system and the camera frame's. For example, in step 1, when selecting the preview size, the preview height must not exceed the SurfaceView's width: the camera frame's height corresponds to the SurfaceView's width, and the frame's width to the SurfaceView's height. So every place that uses the SurfaceView's width and height needs fixing, because the original code uses the camera frame's width and height everywhere.

Most importantly, step 1 calls the AllocateCache function to create a Bitmap; in step 2 the camera frame is converted into this Bitmap and drawn directly onto the SurfaceView's canvas. This Bitmap's width and height should therefore be the reverse of the camera frame's, because the frame data is used after being rotated 90 degrees clockwise. In step 2, before the frame is converted into the Bitmap, the frame data must be rotated clockwise; after the rotation the frame and the Bitmap match.

I chose to perform this rotation in the JavaCameraFrame class. That class is simple: it holds a Mat member in YUV420sp format, and onPreviewFrame stores each incoming frame into it. It implements the gray() and rgba() methods of the CvCameraViewFrame interface, which return the grayscale and RGBA images converted from the YUV420sp data, still in the camera frame's orientation. I rotate the returned result inside gray() and rgba(). I think this is a good place to do it: the data handed to JavaCameraView clients via onCameraFrame is of type JavaCameraFrame, so inside onCameraFrame clients call gray() or rgba() as usual and get a correctly oriented Mat, and since this happens before the frame is converted into the Bitmap, the data is already in the right orientation at conversion time.
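The width/height swap in AllocateCache and the rotation in gray()/rgba() both follow from the same fact: rotating a W×H frame 90 degrees clockwise produces an H×W frame. Here is a minimal, hypothetical index-mapping sketch in plain Java (no OpenCV; at the Mat level something like Core.rotate(src, dst, Core.ROTATE_90_CLOCKWISE) would do the equivalent):

```java
public class RotateSketch {
    // Rotate a width*height single-channel frame 90 degrees clockwise.
    // The destination is height wide and width tall: source pixel (x, y)
    // lands at column (height - 1 - y), row x in the destination.
    static int[] rotate90cw(int[] src, int width, int height) {
        int[] dst = new int[src.length];
        int dstWidth = height;           // dimensions swap after rotation
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int dx = height - 1 - y; // new column
                int dy = x;              // new row
                dst[dy * dstWidth + dx] = src[y * width + x];
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        // 3x2 source frame:      After rotating clockwise (2x3):
        // 1 2 3                  4 1
        // 4 5 6                  5 2
        //                        6 3
        int[] src = {1, 2, 3, 4, 5, 6};
        int[] out = rotate90cw(src, 3, 2);
        System.out.println(java.util.Arrays.toString(out)); // [4, 1, 5, 2, 6, 3]
    }
}
```

This is why the cache Bitmap must be allocated with mFrameHeight as its width and mFrameWidth as its height: after the rotation the buffers line up exactly.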

My skill is limited and the explanation above may not be entirely clear, so here is the code. Every place I changed is marked with a #Modified comment; there are not many changes.


1. CameraBridgeViewBase

package org.opencv.android;

import java.util.List;

import org.opencv.BuildConfig;
import org.opencv.R;
import org.opencv.core.Mat;
import org.opencv.core.Size;

import android.app.Activity;
import android.app.AlertDialog;
import android.content.Context;
import android.content.DialogInterface;
import android.content.res.TypedArray;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Rect;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

/**
 * This is a basic class, implementing the interaction with Camera and OpenCV library.
 * The main responsibility of it - is to control when camera can be enabled, process the frame,
 * call external listener to make any adjustments to the frame and then draw the resulting
 * frame to the screen.
 * The clients shall implement CvCameraViewListener.
 */
public abstract class CameraBridgeViewBase extends SurfaceView implements SurfaceHolder.Callback {

    private static final String TAG = "CameraBridge";
    private static final int MAX_UNSPECIFIED = -1;
    private static final int STOPPED = 0;
    private static final int STARTED = 1;

    private int mState = STOPPED;
    private Bitmap mCacheBitmap;
    private CvCameraViewListener2 mListener;
    private boolean mSurfaceExist;
    private final Object mSyncObject = new Object();

    protected int mFrameWidth;
    protected int mFrameHeight;
    protected int mMaxHeight;
    protected int mMaxWidth;
    protected float mScale = 0;
    protected int mPreviewFormat = RGBA;
    protected int mCameraIndex = CAMERA_ID_ANY;
    protected boolean mEnabled;
    protected FpsMeter mFpsMeter = null;

    public static final int CAMERA_ID_ANY   = -1;
    public static final int CAMERA_ID_BACK  = 99;
    public static final int CAMERA_ID_FRONT = 98;
    public static final int RGBA = 1;
    public static final int GRAY = 2;

    public CameraBridgeViewBase(Context context, int cameraId) {
        super(context);
        mCameraIndex = cameraId;
        getHolder().addCallback(this);
        mMaxWidth = MAX_UNSPECIFIED;
        mMaxHeight = MAX_UNSPECIFIED;
    }

    public CameraBridgeViewBase(Context context, AttributeSet attrs) {
        super(context, attrs);

        int count = attrs.getAttributeCount();
        Log.d(TAG, "Attr count: " + Integer.valueOf(count));

        TypedArray styledAttrs = getContext().obtainStyledAttributes(attrs, R.styleable.CameraBridgeViewBase);
        if (styledAttrs.getBoolean(R.styleable.CameraBridgeViewBase_show_fps, false))
            enableFpsMeter();

        mCameraIndex = styledAttrs.getInt(R.styleable.CameraBridgeViewBase_camera_id, -1);

        getHolder().addCallback(this);
        mMaxWidth = MAX_UNSPECIFIED;
        mMaxHeight = MAX_UNSPECIFIED;
        styledAttrs.recycle();
    }

    /**
     * Sets the camera index
     * @param cameraIndex new camera index
     */
    public void setCameraIndex(int cameraIndex) {
        this.mCameraIndex = cameraIndex;
    }

    public interface CvCameraViewListener {
        /**
         * This method is invoked when camera preview has started. After this method is invoked
         * the frames will start to be delivered to client via the onCameraFrame() callback.
         * @param width -  the width of the frames that will be delivered
         * @param height - the height of the frames that will be delivered
         */
        public void onCameraViewStarted(int width, int height);

        /**
         * This method is invoked when camera preview has been stopped for some reason.
         * No frames will be delivered via onCameraFrame() callback after this method is called.
         */
        public void onCameraViewStopped();

        /**
         * This method is invoked when delivery of the frame needs to be done.
         * The returned values - is a modified frame which needs to be displayed on the screen.
         * TODO: pass the parameters specifying the format of the frame (BPP, YUV or RGB and etc)
         */
        public Mat onCameraFrame(Mat inputFrame);
    }

    public interface CvCameraViewListener2 {
        /**
         * This method is invoked when camera preview has started. After this method is invoked
         * the frames will start to be delivered to client via the onCameraFrame() callback.
         * @param width -  the width of the frames that will be delivered
         * @param height - the height of the frames that will be delivered
         */
        public void onCameraViewStarted(int width, int height);

        /**
         * This method is invoked when camera preview has been stopped for some reason.
         * No frames will be delivered via onCameraFrame() callback after this method is called.
         */
        public void onCameraViewStopped();

        /**
         * This method is invoked when delivery of the frame needs to be done.
         * The returned values - is a modified frame which needs to be displayed on the screen.
         * TODO: pass the parameters specifying the format of the frame (BPP, YUV or RGB and etc)
         */
        public Mat onCameraFrame(CvCameraViewFrame inputFrame);
    };

    protected class CvCameraViewListenerAdapter implements CvCameraViewListener2  {
        public CvCameraViewListenerAdapter(CvCameraViewListener oldStypeListener) {
            mOldStyleListener = oldStypeListener;
        }

        public void onCameraViewStarted(int width, int height) {
            mOldStyleListener.onCameraViewStarted(width, height);
        }

        public void onCameraViewStopped() {
            mOldStyleListener.onCameraViewStopped();
        }

        public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
             Mat result = null;
             switch (mPreviewFormat) {
                case RGBA:
                    result = mOldStyleListener.onCameraFrame(inputFrame.rgba());
                    break;
                case GRAY:
                    result = mOldStyleListener.onCameraFrame(inputFrame.gray());
                    break;
                default:
                    Log.e(TAG, "Invalid frame format! Only RGBA and Gray Scale are supported!");
            };

            return result;
        }

        public void setFrameFormat(int format) {
            mPreviewFormat = format;
        }

        private int mPreviewFormat = RGBA;
        private CvCameraViewListener mOldStyleListener;
    };

    /**
     * This class interface is abstract representation of single frame from camera for onCameraFrame callback
     * Attention: Do not use objects, that represents this interface out of onCameraFrame callback!
     */
    public interface CvCameraViewFrame {

        /**
         * This method returns RGBA Mat with frame
         */
        public Mat rgba();

        /**
         * This method returns single channel gray scale Mat with frame
         */
        public Mat gray();
    };

    /*
    Overrides of the SurfaceHolder.Callback methods
     */
    /*
    Access to the underlying surface is provided via the SurfaceHolder interface,
    which can be retrieved by calling getHolder().
    The Surface will be created for you while the SurfaceView's window is visible;
    you should implement SurfaceHolder.Callback.surfaceCreated(SurfaceHolder)
    and SurfaceHolder.Callback.surfaceDestroyed(SurfaceHolder) to discover when the
    Surface is created and destroyed as the window is shown and hidden.
    One of the purposes of this class is to provide a surface in which a secondary
    thread can render into the screen. If you are going to use it this way,
    you need to be aware of some threading semantics:
    All SurfaceView and SurfaceHolder.Callback methods will be called from
    the thread running the SurfaceView's window (typically the main thread of the
    application). They thus need to correctly synchronize with any state that is
    also touched by the drawing thread.
    You must ensure that the drawing thread only touches the underlying Surface
    while it is valid -- between SurfaceHolder.Callback.surfaceCreated()
    and SurfaceHolder.Callback.surfaceDestroyed().
     */
    /*
    This is called immediately after any structural changes (format or size)
    have been made to the surface. You should at this point update the imagery
    in the surface. This method is always called at least once,
    after surfaceCreated(SurfaceHolder).
     */
    public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
        Log.d(TAG, "call surfaceChanged event");
        synchronized(mSyncObject) {
            if (!mSurfaceExist) {
                mSurfaceExist = true;
                checkCurrentState();
            } else {
                /** Surface changed. We need to stop camera and restart with new parameters */
                /* Pretend that old surface has been destroyed */
                mSurfaceExist = false;
                checkCurrentState();
                /* Now use new surface. Say we have it now */
                mSurfaceExist = true;
                checkCurrentState();
            }
        }
    }

    /*
    This is called immediately after the surface is first created.
    Implementations of this should start up whatever rendering code they desire.
    Note that only one thread can ever draw into a Surface,
    so you should not draw into the Surface here if your normal rendering
    will be in another thread.
     */
    public void surfaceCreated(SurfaceHolder holder) {
        /* Do nothing. Wait until surfaceChanged delivered */
    }

    /*
    This is called immediately before a surface is being destroyed.
    After returning from this call, you should no longer try to access this surface.
    If you have a rendering thread that directly accesses the surface,
    you must ensure that thread is no longer touching the Surface before returning
    from this function.
     */
    public void surfaceDestroyed(SurfaceHolder holder) {
        synchronized(mSyncObject) {
            mSurfaceExist = false;
            checkCurrentState();
        }
    }

    /**
     * This method is provided for clients, so they can enable the camera connection.
     * The actual onCameraViewStarted callback will be delivered only after both this method is called and surface is available
     */
    public void enableView() {
        synchronized(mSyncObject) {
            mEnabled = true;
            checkCurrentState();
        }
    }

    /**
     * This method is provided for clients, so they can disable camera connection and stop
     * the delivery of frames even though the surface view itself is not destroyed and still stays on the screen
     */
    public void disableView() {
        synchronized(mSyncObject) {
            mEnabled = false;
            checkCurrentState();
        }
    }

    /**
     * This method enables label with fps value on the screen
     */
    public void enableFpsMeter() {
        if (mFpsMeter == null) {
            mFpsMeter = new FpsMeter();
            mFpsMeter.setResolution(mFrameWidth, mFrameHeight);
        }
    }

    public void disableFpsMeter() {
            mFpsMeter = null;
    }

    /**
     *
     * @param listener
     */

    public void setCvCameraViewListener(CvCameraViewListener2 listener) {
        mListener = listener;
    }

    public void setCvCameraViewListener(CvCameraViewListener listener) {
        CvCameraViewListenerAdapter adapter = new CvCameraViewListenerAdapter(listener);
        adapter.setFrameFormat(mPreviewFormat);
        mListener = adapter;
    }

    /**
     * This method sets the maximum size that camera frame is allowed to be. When selecting
     * size - the biggest size which less or equal the size set will be selected.
     * As an example - we set setMaxFrameSize(200,200) and we have 176x152 and 320x240 sizes. The
     * preview frame will be selected with 176x152 size.
     * This method is useful when need to restrict the size of preview frame for some reason (for example for video recording)
     * @param maxWidth - the maximum width allowed for camera frame.
     * @param maxHeight - the maximum height allowed for camera frame
     */
    public void setMaxFrameSize(int maxWidth, int maxHeight) {
        mMaxWidth = maxWidth;
        mMaxHeight = maxHeight;
    }

    public void SetCaptureFormat(int format)
    {
        mPreviewFormat = format;
        if (mListener instanceof CvCameraViewListenerAdapter) {
            CvCameraViewListenerAdapter adapter = (CvCameraViewListenerAdapter) mListener;
            adapter.setFrameFormat(mPreviewFormat);
        }
    }

    /**
     * Called when mSyncObject lock is held
     */
    private void checkCurrentState() {
        Log.d(TAG, "call checkCurrentState");
        int targetState;
        // enableView() sets mEnabled to true; surfaceChanged() sets mSurfaceExist.
        // getVisibility() == VISIBLE seems to always hold.
        // targetState becomes STARTED once the surface is ready and the client has called enableView()
        if (mEnabled && mSurfaceExist && getVisibility() == VISIBLE) {
            targetState = STARTED;
        } else {
            targetState = STOPPED;
        }

        // mState starts out as STOPPED.
        // If the target state differs from the current one, exit the current state and enter the target one.
        if (targetState != mState) {
            /* The state change detected. Need to exit the current state and enter target state */
            processExitState(mState);
            mState = targetState;
            processEnterState(mState);
        }
    }

    private void processEnterState(int state) {
        Log.d(TAG, "call processEnterState: " + state);
        switch(state) {
        case STARTED:
            // this is where the camera is actually started
            onEnterStartedState();
            if (mListener != null) {
                // after entering STARTED, if the CvCameraViewListener2 member mListener is not
                // null, call its onCameraViewStarted method to notify it that the camera started
                mListener.onCameraViewStarted(mFrameWidth, mFrameHeight);
            }
            break;
        case STOPPED:
            onEnterStoppedState();
            if (mListener != null) {
                // after entering STOPPED, if the CvCameraViewListener2 member mListener is not
                // null, call its onCameraViewStopped method to notify it that the camera stopped
                mListener.onCameraViewStopped();
            }
            break;
        };
    }

    private void processExitState(int state) {
        Log.d(TAG, "call processExitState: " + state);
        switch(state) {
        case STARTED:
            onExitStartedState();
            break;
        case STOPPED:
            onExitStoppedState();
            break;
        };
    }

    private void onEnterStoppedState() {
        /* nothing to do */
    }

    private void onExitStoppedState() {
        /* nothing to do */
    }

    // NOTE: The order of bitmap constructor and camera connection is important for android 4.1.x
    // Bitmap must be constructed before surface
    private void onEnterStartedState() {
        Log.d(TAG, "call onEnterStartedState");
        /* Connect camera */
        // connectCamera takes the CameraBridgeViewBase's width and height
        if (!connectCamera(getWidth(), getHeight())) {
            AlertDialog ad = new AlertDialog.Builder(getContext()).create();
            ad.setCancelable(false); // This blocks the 'BACK' button
            ad.setMessage("It seems that your device does not support camera (or it is locked). Application will be closed.");
            ad.setButton(DialogInterface.BUTTON_NEUTRAL,  "OK", new DialogInterface.OnClickListener() {
                public void onClick(DialogInterface dialog, int which) {
                    dialog.dismiss();
                    ((Activity) getContext()).finish();
                }
            });
            ad.show();

        }
    }

    private void onExitStartedState() {
        disconnectCamera();
        if (mCacheBitmap != null) {
            mCacheBitmap.recycle();
        }
    }

    /*
        onPreviewFrame is called on the UI thread; after storing the data it notifies
        another thread, and that thread calls this method to process the data.
        onPreviewFrame acts as the producer; the other thread is the consumer.
        When the JavaCameraView class is used, frame is of type JavaCameraFrame,
        which implements the interface via OpenCV.
     */
    /**
     * This method shall be called by the subclasses when they have valid
     * object and want it to be delivered to external client (via callback) and
     * then displayed on the screen.
     * @param frame - the current frame to be delivered
     */
    protected void deliverAndDrawFrame(CvCameraViewFrame frame) {
        Mat modified;

        if (mListener != null) {
            // CvCameraViewListener2 mListener is set by the client.
            // Invoke the client's overridden callback and take its return value.
            // All of this runs on the data-processing thread.
            modified = mListener.onCameraFrame(frame);
        } else {
            // If the client set no CvCameraViewListener2, i.e. it does not want to
            // process the preview data, modified is set to the RGBA Mat converted
            // from the data delivered by onPreviewFrame.
            modified = frame.rgba();
        }

        // log the sizes of the Mat and of the Bitmap
        Log.d("FunnyAR","mScale: "+mScale+" modified.rows: "+modified.rows()
                +" modified.cols: "+modified.cols()+" mCacheBitmap.getWidth(): "+
                mCacheBitmap.getWidth()+" mCacheBitmap.getHeight() "+
                mCacheBitmap.getHeight());

        // flag: whether converting modified to the Bitmap succeeded
        boolean bmpValid = true;
        // if we do have a modified Mat, convert it to the Bitmap
        if (modified != null) {
            try {
                Utils.matToBitmap(modified, mCacheBitmap);
            } catch(Exception e) {
                Log.e(TAG, "Mat type: " + modified);
                Log.e(TAG, "Bitmap type: " + mCacheBitmap.getWidth() + "*" + mCacheBitmap.getHeight());
                Log.e(TAG, "Utils.matToBitmap() throws an exception: " + e.getMessage());
                bmpValid = false;
            }
        }
        // on success, draw it onto the surface via the canvas
        if (bmpValid && mCacheBitmap != null) {
            Canvas canvas = getHolder().lockCanvas();
            if (canvas != null) {
                canvas.drawColor(0, android.graphics.PorterDuff.Mode.CLEAR);
                if (BuildConfig.DEBUG)
                    Log.d(TAG, "mStretch value: " + mScale);

                if (mScale != 0) {
                    canvas.drawBitmap(mCacheBitmap, new Rect(0,0,mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
                         new Rect((int)((canvas.getWidth() - mScale*mCacheBitmap.getWidth()) / 2),
                         (int)((canvas.getHeight() - mScale*mCacheBitmap.getHeight()) / 2),
                         (int)((canvas.getWidth() - mScale*mCacheBitmap.getWidth()) / 2 + mScale*mCacheBitmap.getWidth()),
                         (int)((canvas.getHeight() - mScale*mCacheBitmap.getHeight()) / 2 + mScale*mCacheBitmap.getHeight())), null);
                } else {
                     canvas.drawBitmap(mCacheBitmap, new Rect(0,0,mCacheBitmap.getWidth(), mCacheBitmap.getHeight()),
                         new Rect((canvas.getWidth() - mCacheBitmap.getWidth()) / 2,
                         (canvas.getHeight() - mCacheBitmap.getHeight()) / 2,
                         (canvas.getWidth() - mCacheBitmap.getWidth()) / 2 + mCacheBitmap.getWidth(),
                         (canvas.getHeight() - mCacheBitmap.getHeight()) / 2 + mCacheBitmap.getHeight()), null);
                }

                if (mFpsMeter != null) {
                    mFpsMeter.measure();
                    mFpsMeter.draw(canvas, 20, 30);
                }
                getHolder().unlockCanvasAndPost(canvas);
            }
        }
    }

    /**
     * This method shall perform the concrete operations needed to initialize the camera.
     * CONTRACT: as a result of this method variables mFrameWidth and mFrameHeight MUST be
     * initialized with the size of the Camera frames that will be delivered to external processor.
     * @param width - the width of this SurfaceView
     * @param height - the height of this SurfaceView
     */
    // the concrete camera startup is implemented by subclasses
    protected abstract boolean connectCamera(int width, int height);

    /**
     * Disconnects and release the particular camera object being connected to this surface view.
     * Called when syncObject lock is held
     */
    protected abstract void disconnectCamera();

    // NOTE: On Android 4.1.x the function must be called before SurfaceTexture constructor!
    protected void AllocateCache()
    {
        //mCacheBitmap = Bitmap.createBitmap(mFrameWidth, mFrameHeight, Bitmap.Config.ARGB_8888);
        //#Modified portrait step2
        // For correct orientation, mCacheBitmap stores the camera frame data rotated by 90 degrees;
        // after that rotation mFrameWidth and mFrameHeight are swapped
        int portraitWidth=mFrameHeight;
        int portraitHeight=mFrameWidth;
        mCacheBitmap = Bitmap.createBitmap(portraitWidth, portraitHeight, Bitmap.Config.ARGB_8888);
    }

    public interface ListItemAccessor {
        public int getWidth(Object obj);
        public int getHeight(Object obj);
    };

    /**
     * This helper method can be called by subclasses to select camera preview size.
     * It goes over the list of the supported preview sizes and selects the maximum one which
     * fits both values set via setMaxFrameSize() and surface frame allocated for this view
     * @param supportedSizes
     * @param surfaceWidth
     * @param surfaceHeight
     * @return optimal frame size
     */
    protected Size calculateCameraFrameSize(List<?> supportedSizes, ListItemAccessor accessor, int surfaceWidth, int surfaceHeight) {
        // pick a camera frame size
        int calcWidth = 0;
        int calcHeight = 0;

        // maximum allowed width and height
        //#Modified step4
        // the camera frame's mMaxWidth must be compared against the surface's surfaceHeight,
        // and the camera frame's mMaxHeight against the surface's surfaceWidth
        //int maxAllowedWidth = (mMaxWidth != MAX_UNSPECIFIED && mMaxWidth < surfaceWidth)? mMaxWidth : surfaceWidth;
        //int maxAllowedHeight = (mMaxHeight != MAX_UNSPECIFIED && mMaxHeight < surfaceHeight)? mMaxHeight : surfaceHeight;
        int maxAllowedWidth = (mMaxWidth != MAX_UNSPECIFIED && mMaxWidth < surfaceHeight)? mMaxWidth : surfaceHeight;
        int maxAllowedHeight = (mMaxHeight != MAX_UNSPECIFIED && mMaxHeight < surfaceWidth)? mMaxHeight : surfaceWidth;

        for (Object size : supportedSizes) {
            int width = accessor.getWidth(size);
            int height = accessor.getHeight(size);

            // pick the largest size within the allowed limits;
            // a client can select a low-resolution frame by setting small mMaxWidth/mMaxHeight
            if (width <= maxAllowedWidth && height <= maxAllowedHeight) {
                if (width >= calcWidth && height >= calcHeight) {
                    calcWidth = (int) width;
                    calcHeight = (int) height;
                }
            }
        }

        return new Size(calcWidth, calcHeight);
    }
}

2. JavaCameraView

package org.opencv.android;

import java.util.List;

import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.os.Build;
import android.util.AttributeSet;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

import org.opencv.BuildConfig;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;

/**
 * This class is an implementation of the Bridge View between OpenCV and Java Camera.
 * This class relies on the functionality available in the base class and only implements
 * required functions:
 * connectCamera - opens Java camera and sets the PreviewCallback to be delivered.
 * disconnectCamera - closes the camera and stops preview.
 * When frame is delivered via callback from Camera - it processed via OpenCV to be
 * converted to RGBA32 and then passed to the external callback for modifications if required.
 */
public class JavaCameraView extends CameraBridgeViewBase implements PreviewCallback {

    private static final int MAGIC_TEXTURE_ID = 10;
    private static final String TAG = "JavaCameraView";

    private byte mBuffer[];
    private Mat[] mFrameChain;
    private int mChainIdx = 0;
    private Thread mThread;
    private boolean mStopThread;

    protected Camera mCamera;
    protected JavaCameraFrame[] mCameraFrame;
    private SurfaceTexture mSurfaceTexture;
    private int mPreviewFormat = ImageFormat.NV21;

    public static class JavaCameraSizeAccessor implements ListItemAccessor {

        @Override
        public int getWidth(Object obj) {
            Camera.Size size = (Camera.Size) obj;
            return size.width;
        }

        @Override
        public int getHeight(Object obj) {
            Camera.Size size = (Camera.Size) obj;
            return size.height;
        }
    }

    public JavaCameraView(Context context, int cameraId) {
        super(context, cameraId);
    }

    public JavaCameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // the arguments passed in are the JavaCameraView's width and height
    protected boolean initializeCamera(int width, int height) {
        Log.d(TAG, "Initialize java camera");
        boolean result = true;
        synchronized (this){
            mCamera = null;
            // mCameraIndex specifies the camera type (an app-level setting), not a camera ID;
            // the actual camera ID has to be looked up from the type.
            // It is inherited from the parent class CameraBridgeViewBase
            // and its initial value is CAMERA_ID_ANY.
            if (mCameraIndex == CAMERA_ID_ANY) {
                Log.d(TAG, "Trying to open camera with old open()");
                try {
                    // first try to open a camera without specifying a type
                    mCamera = Camera.open();
                }
                catch (Exception e){
                    Log.e(TAG, "Camera is not available (in use or does not exist): " + e.getLocalizedMessage());
                }

                if(mCamera == null && Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
                    boolean connected = false;
                    for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                        Log.d(TAG, "Trying to open camera with new open(" + Integer.valueOf(camIdx) + ")");
                        try {
                            // if opening without a type failed, iterate over all camera IDs
                            // and try each; keep the first camera that opens successfully
                            mCamera = Camera.open(camIdx);
                            connected = true;
                        } catch (RuntimeException e) {
                            Log.e(TAG, "Camera #" + camIdx + " failed to open: " + e.getLocalizedMessage());
                        }
                        if (connected) break;
                    }
                }
            } else {
                // this branch handles the case where a camera type was specified
                if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
                    int localCameraIndex = mCameraIndex;
                    if (mCameraIndex == CAMERA_ID_BACK) {
                        Log.i(TAG, "Trying to open back camera");
                        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                        // find the camera ID corresponding to the requested type
                        for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                            Camera.getCameraInfo( camIdx, cameraInfo );
                            if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
                                localCameraIndex = camIdx;
                                break;
                            }
                        }
                    } else if (mCameraIndex == CAMERA_ID_FRONT) {
                        Log.i(TAG, "Trying to open front camera");
                        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                        for (int camIdx = 0; camIdx < Camera.getNumberOfCameras(); ++camIdx) {
                            Camera.getCameraInfo( camIdx, cameraInfo );
                            if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                                localCameraIndex = camIdx;
                                break;
                            }
                        }
                    }
                    if (localCameraIndex == CAMERA_ID_BACK) {
                        // localCameraIndex starts as the CAMERA_ID_BACK marker, requesting the back camera;
                        // if a back camera exists, it has already been replaced by that camera's real ID here
                        Log.e(TAG, "Back camera not found!");
                    } else if (localCameraIndex == CAMERA_ID_FRONT) {
                        Log.e(TAG, "Front camera not found!");
                    } else {
                        Log.d(TAG, "Trying to open camera with new open(" + Integer.valueOf(localCameraIndex) + ")");
                        try {
                            //Open the camera with the ID we found
                            mCamera = Camera.open(localCameraIndex);
                        } catch (RuntimeException e) {
                            Log.e(TAG, "Camera #" + localCameraIndex + " failed to open: " + e.getLocalizedMessage());
                        }
                    }
                }
            }

            //If the camera failed to open, return false
            if (mCamera == null)
                return false;

            /* Now set camera parameters */
            try {
                Camera.Parameters params = mCamera.getParameters();
                Log.d(TAG, "getSupportedPreviewSizes()");
                List<android.hardware.Camera.Size> sizes = params.getSupportedPreviewSizes();

                if (sizes != null) {
                    //Select the preview size
                    /* Select the size that fits surface considering maximum size allowed */
                    Size frameSize = calculateCameraFrameSize(sizes, new JavaCameraSizeAccessor(), width, height);
                    //width and height here come from connectCamera(getWidth(), getHeight()),
                    //i.e. the SurfaceView's size, which is also the surface size
                    //Log the camera frame size and the surface size
                    Log.d("FunnyAR","surface width: "+width+" surface height: "+height+
                            "frameSize: "+frameSize.toString());

                    //Select the preview format
                    /* Image format NV21 causes issues in the Android emulators */
                    if (Build.FINGERPRINT.startsWith("generic")
                            || Build.FINGERPRINT.startsWith("unknown")
                            || Build.MODEL.contains("google_sdk")
                            || Build.MODEL.contains("Emulator")
                            || Build.MODEL.contains("Android SDK built for x86")
                            || Build.MANUFACTURER.contains("Genymotion")
                            || (Build.BRAND.startsWith("generic") && Build.DEVICE.startsWith("generic"))
                            || "google_sdk".equals(Build.PRODUCT))
                        params.setPreviewFormat(ImageFormat.YV12);  // "generic" or "android" = android emulator
                    else
                        params.setPreviewFormat(ImageFormat.NV21);

                    //Record the preview format in a member variable
                    mPreviewFormat = params.getPreviewFormat();

                    Log.d(TAG, "Set preview size to " + Integer.valueOf((int)frameSize.width) + "x" + Integer.valueOf((int)frameSize.height));
                    params.setPreviewSize((int)frameSize.width, (int)frameSize.height);

                    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH && !android.os.Build.MODEL.equals("GT-I9100"))
                        params.setRecordingHint(true);

                    //JavaCameraView's focus mode is hard-coded as well
                    List<String> FocusModes = params.getSupportedFocusModes();
                    if (FocusModes != null && FocusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO))
                    {
                        params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
                    }

                    mCamera.setParameters(params);
                    params = mCamera.getParameters();

                    //Record the frame size
                    mFrameWidth = params.getPreviewSize().width;
                    mFrameHeight = params.getPreviewSize().height;

                    //Scaling is involved here
                    /*
                        #Modified portrait step1
                        So that the scale is applied when drawing onto the canvas in
                        deliverAndDrawFrame, set on <JavaCameraView>:
                        android:layout_width="match_parent"
                        android:layout_height="match_parent"
                        To pick a specific scaled size, wrap <JavaCameraView> in a
                        LinearLayout with fixed dimensions.
                        In portrait orientation the ratios are
                        surface width / camera frame mFrameHeight and
                        surface height / camera frame mFrameWidth.
                        If you don't want to configure <JavaCameraView>, simply
                        removing the if statement here should work as well.
                     */
                    if ((getLayoutParams().width == LayoutParams.MATCH_PARENT) && (getLayoutParams().height == LayoutParams.MATCH_PARENT))
                        //mScale = Math.min(((float)height)/mFrameHeight, ((float)width)/mFrameWidth);
                        mScale = Math.min(((float)width)/mFrameHeight, ((float)height)/mFrameWidth);
                    else
                        mScale = 0;

                    //Log the scale and the camera frame size
                    Log.d("FunnyAR","mScale: "+mScale+" mFrameWidth: "+mFrameWidth+
                            " mFrameHeight: "+mFrameHeight);

                    if (mFpsMeter != null) {
                        mFpsMeter.setResolution(mFrameWidth, mFrameHeight);
                    }

                    //Compute the frame size in bytes and allocate a matching buffer to receive the data
                    //number of pixels
                    int size = mFrameWidth * mFrameHeight;
                    //pixels * bits per pixel for the current format / 8 bits per byte == bytes per frame
                    size = size * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
                    mBuffer = new byte[size];

                    /*
                    Adds a pre-allocated buffer to the preview callback buffer queue.
                    Applications can add one or more buffers to the queue.
                    When a preview frame arrives and there is still at least
                    one available buffer, the buffer will be used and removed from the queue.
                    Then preview callback is invoked with the buffer.
                    If a frame arrives and there is no buffer left, the frame is discarded.
                    Applications should add buffers back when they finish processing the data
                     in them.
                     */
                    /*
                    This method is only necessary when setPreviewCallbackWithBuffer(PreviewCallback)
                     is used. When setPreviewCallback(PreviewCallback) or
                     setOneShotPreviewCallback(PreviewCallback) are used,
                     buffers are automatically allocated.
                     When a supplied buffer is too small to hold the preview frame data,
                     preview callback will return null and the buffer will be removed from the
                     buffer queue.
                     */
                    mCamera.addCallbackBuffer(mBuffer);
                    /*
                    Installs a callback to be invoked for every preview frame,
                    using buffers supplied with addCallbackBuffer(byte[]),
                    in addition to displaying them on the screen.
                    The callback will be repeatedly called for as long as preview is active
                    and buffers are available. Any other preview callbacks are overridden.
                     */
                    mCamera.setPreviewCallbackWithBuffer(this);

                    //An array of two Mats used as a double buffer
                    //Note the Yuv420sp layout: height + height/2 rows
                    mFrameChain = new Mat[2];
                    mFrameChain[0] = new Mat(mFrameHeight + (mFrameHeight/2), mFrameWidth, CvType.CV_8UC1);
                    mFrameChain[1] = new Mat(mFrameHeight + (mFrameHeight/2), mFrameWidth, CvType.CV_8UC1);

                    //Inherited method; allocates memory for the inherited Bitmap mCacheBitmap
                    AllocateCache();

                    //JavaCameraFrame holds a reference to the Mat:
                    //mCameraFrame[0].mYuvFrameData is the same Mat as mFrameChain[0]
                    mCameraFrame = new JavaCameraFrame[2];
                    mCameraFrame[0] = new JavaCameraFrame(mFrameChain[0], mFrameWidth, mFrameHeight);
                    mCameraFrame[1] = new JavaCameraFrame(mFrameChain[1], mFrameWidth, mFrameHeight);

                    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
                        mSurfaceTexture = new SurfaceTexture(MAGIC_TEXTURE_ID);
                        mCamera.setPreviewTexture(mSurfaceTexture);
                    } else
                       mCamera.setPreviewDisplay(null);

                    /* Finally we are ready to start the preview */
                    Log.d(TAG, "startPreview");
                    mCamera.startPreview();
                }
                else
                    result = false;
            } catch (Exception e) {
                result = false;
                e.printStackTrace();
            }
        }

        return result;
    }

    protected void releaseCamera() {
        synchronized (this) {
            if (mCamera != null) {
                mCamera.stopPreview();
                mCamera.setPreviewCallback(null);

                mCamera.release();
            }
            mCamera = null;
            if (mFrameChain != null) {
                mFrameChain[0].release();
                mFrameChain[1].release();
            }
            if (mCameraFrame != null) {
                mCameraFrame[0].release();
                mCameraFrame[1].release();
            }
        }
    }

    private boolean mCameraFrameReady = false;

    //Overrides the parent class's abstract method; responsible for starting the camera
    @Override
    protected boolean connectCamera(int width, int height) {

        /* 1. We need to instantiate camera
         * 2. We need to start thread which will be getting frames
         */
        /* First step - initialize camera connection */
        Log.d(TAG, "Connecting to camera");
        //initializeCamera performs the actual camera initialization
        if (!initializeCamera(width, height))
            return false;

        mCameraFrameReady = false;

        /* now we can start update thread */
        Log.d(TAG, "Starting processing thread");
        mStopThread = false;
        mThread = new Thread(new CameraWorker());
        mThread.start();

        return true;
    }

    @Override
    protected void disconnectCamera() {
        /* 1. We need to stop thread which updating the frames
         * 2. Stop camera and release it
         */
        Log.d(TAG, "Disconnecting from camera");
        try {
            mStopThread = true;
            Log.d(TAG, "Notify thread");
            synchronized (this) {
                this.notify();
            }
            Log.d(TAG, "Waiting for thread");
            if (mThread != null)
                mThread.join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        } finally {
            mThread =  null;
        }

        /* Now release camera */
        releaseCamera();

        mCameraFrameReady = false;
    }

    /*
    Overrides the Camera.PreviewCallback onPreviewFrame method.
    onPreviewFrame runs on the UI thread here, because the camera was opened
    on the main thread, but the data is handed off to another thread for processing.
     */
    /*
    Callback interface used to deliver copies of preview frames as they are displayed.
    Called as preview frames are displayed. This callback is invoked
     on the event thread Camera.open(int) was called from.
     */
    @Override
    public void onPreviewFrame(byte[] frame, Camera arg1) {
        if (BuildConfig.DEBUG)
            Log.d(TAG, "Preview Frame received. Frame size: " + frame.length);
        synchronized (this) {
            //mChainIdx toggles between 0 and 1 and is managed by the worker thread
            //put() is specific to OpenCV's Java layer
            //mFrameChain[mChainIdx] is 1.5*height x width; store the data into it
            mFrameChain[mChainIdx].put(0, 0, frame);
            //Flag that the data has been stored
            mCameraFrameReady = true;
            //Wake up one thread waiting on JavaCameraView.this

            this.notify();
        }
        /*
        While onPreviewFrame handles the data, the buffer passed to addCallbackBuffer()
        is dequeued; once processing is done the buffer must be handed back to the
        callback so the next onPreviewFrame call can reuse it.
         */
        if (mCamera != null)
            mCamera.addCallbackBuffer(mBuffer);
    }

    /*
    JavaCameraFrame implements CvCameraViewFrame's rgba() and gray() methods.
    Instances of this type are handed to the user via mListener.onCameraFrame(frame)
    inside deliverAndDrawFrame(). The JavaCameraFrame accessors are the best place
    to rotate the Mat: the Mat a client obtains through gray()/rgba() is then
    already in portrait orientation.
    #Modified portrait step3
     */
    private class JavaCameraFrame implements CvCameraViewFrame {
        @Override
        public Mat gray() {
            //Return the selected region of the Mat; this is tied to the Yuv420sp layout
            //return mYuvFrameData.submat(0, mHeight, 0, mWidth);
            //#Modified step3.1
            Core.rotate(mYuvFrameData.submat(0, mHeight, 0, mWidth),
                    portrait_gray, Core.ROTATE_90_CLOCKWISE);
            return portrait_gray;
        }

        @Override
        public Mat rgba() {
            if (mPreviewFormat == ImageFormat.NV21)
                Imgproc.cvtColor(mYuvFrameData, mRgba, Imgproc.COLOR_YUV2RGBA_NV21, 4);
            else if (mPreviewFormat == ImageFormat.YV12)
                Imgproc.cvtColor(mYuvFrameData, mRgba, Imgproc.COLOR_YUV2RGB_I420, 4);  // COLOR_YUV2RGBA_YV12 produces inverted colors
            else
                throw new IllegalArgumentException("Preview Format can be NV21 or YV12");

            //#Modified step3.2
            //Rotate the converted RGBA frame (not the Yuv data) into portrait orientation
            Core.rotate(mRgba, portrait_rgba, Core.ROTATE_90_CLOCKWISE);

            return portrait_rgba;
        }

        public JavaCameraFrame(Mat Yuv420sp, int width, int height) {
            super();
            mWidth = width;
            mHeight = height;
            //#Modified
            portrait_mHeight = mWidth;
            portrait_mWidth = mHeight;
            portrait_gray = new Mat(portrait_mHeight, portrait_mWidth, CvType.CV_8UC1);
            portrait_rgba = new Mat(portrait_mHeight, portrait_mWidth, CvType.CV_8UC4);
            mYuvFrameData = Yuv420sp;
            mRgba = new Mat();
        }

        public void release() {
            mRgba.release();
            //#Modified: also release the portrait Mats
            portrait_gray.release();
            portrait_rgba.release();
        }

        private Mat mYuvFrameData;
        private Mat mRgba;
        private int mWidth;
        private int mHeight;
        //#Modified
        private int portrait_mHeight;
        private int portrait_mWidth;
        private Mat portrait_gray;
        private Mat portrait_rgba;
    };

    private class CameraWorker implements Runnable {

        @Override
        public void run() {
            do {
                boolean hasFrame = false;
                synchronized (JavaCameraView.this) {
                    try {
                        //When a frame is ready, onPreviewFrame sets mCameraFrameReady
                        //to true and wakes this thread
                        //mStopThread stays false as long as the camera is running
                        //While the camera is running and no frame is ready, the thread waits
                        //wait() sits inside a while loop to guard against spurious wakeups
                        while (!mCameraFrameReady && !mStopThread) {
                            JavaCameraView.this.wait();
                        }
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                    //The thread was woken because onPreviewFrame has a frame ready
                    if (mCameraFrameReady)
                    {
                        //mChainIdx toggles between 0 and 1 to select mCameraFrame's current buffer
                        mChainIdx = 1 - mChainIdx;
                        //Reset mCameraFrameReady to wait for the next frame from onPreviewFrame
                        mCameraFrameReady = false;
                        //Indicates a frame is now available
                        hasFrame = true;
                    }
                }

                //The thread is still running and a frame is available
                if (!mStopThread && hasFrame) {
                    //Process the current buffer if it is not empty
                    //mChainIdx starts at 0; mChainIdx = 1 - mChainIdx sets it to 1,
                    //so 1 - mChainIdx here is 0; next time mChainIdx is 1 and is
                    //set back to 0, so 1 - mChainIdx here is 1, and so on
                    //mCameraFrame[1 - mChainIdx].mYuvFrameData is a reference to
                    //mFrameChain[1 - mChainIdx], i.e. JavaCameraFrame holds a reference to the Mat
                    if (!mFrameChain[1 - mChainIdx].empty())
                        deliverAndDrawFrame(mCameraFrame[1 - mChainIdx]);
                }
            } while (!mStopThread);
            Log.d(TAG, "Finish processing thread");
        }
    }
}
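The mChainIdx ping-pong in CameraWorker is easy to misread, so here is a minimal plain-Java simulation of just the index arithmetic (a sketch only; the real code does this across two threads under a lock):

```java
public class ChainIdxDemo {
    public static void main(String[] args) {
        int chainIdx = 0;  // slot onPreviewFrame writes into next
        for (int frame = 0; frame < 4; frame++) {
            // onPreviewFrame stores the new frame into mFrameChain[chainIdx]
            int writeIdx = chainIdx;
            // CameraWorker then flips the index so the producer moves on...
            chainIdx = 1 - chainIdx;
            // ...and processes the other slot, which is the one just written
            int processIdx = 1 - chainIdx;
            System.out.println("frame " + frame + ": wrote slot " + writeIdx
                    + ", processed slot " + processIdx);
        }
    }
}
```

This shows why `deliverAndDrawFrame(mCameraFrame[1 - mChainIdx])` always receives the buffer that was just filled, while onPreviewFrame writes into the other one.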

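As a sanity check on the buffer and Bitmap math in initializeCamera, here is a small plain-Java sketch (the 1280x720 preview size is a hypothetical example, not tied to any device):

```java
public class PortraitFrameMath {
    public static void main(String[] args) {
        int frameWidth = 1280, frameHeight = 720;  // hypothetical preview size

        // NV21 uses 12 bits per pixel: a full-resolution Y plane plus an
        // interleaved VU plane at quarter resolution; this matches the
        // getBitsPerPixel-based computation in initializeCamera.
        int bufferBytes = frameWidth * frameHeight * 12 / 8;
        System.out.println("NV21 buffer bytes: " + bufferBytes);  // 1382400

        // The backing Yuv Mat has height + height/2 rows and width columns.
        int matRows = frameHeight + frameHeight / 2;
        System.out.println("Yuv Mat: " + matRows + " x " + frameWidth + " (CV_8UC1)");

        // After a 90-degree clockwise rotation, width and height swap, so the
        // cached Bitmap and the portrait_* Mats must be height x width.
        System.out.println("Portrait frame: " + frameHeight + " x " + frameWidth);
    }
}
```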
3. The layout file

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <!-- 1080px and 1440px are hard-coded here to fit the Redmi Note 4X screen -->
    <!--<org.opencv.android.JavaCameraView
        android:id="@+id/javaCameraView"
        android:layout_width="1080px"
        android:layout_height="1440px" />-->

    <!-- Wrapper so that scaling is applied -->
    <LinearLayout
        android:layout_width="1080px"
        android:layout_height="1440px"
        android:orientation="vertical">

        <org.opencv.android.JavaCameraView
            android:id="@+id/javaCameraView"
            android:layout_width="match_parent"
            android:layout_height="match_parent" />

    </LinearLayout>

    <TextView
        android:layout_width="1080px"
        android:layout_height="480px"
        android:text="FunnyAR!" />

</LinearLayout>
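To see what the modified mScale formula from step1 yields with this layout, here is a small plain-Java sketch (the 1080x1440 surface matches the LinearLayout above; the 1280x720 preview size is a hypothetical example):

```java
public class PortraitScaleDemo {
    public static void main(String[] args) {
        int surfaceWidth = 1080, surfaceHeight = 1440;  // from the layout above
        int frameWidth = 1280, frameHeight = 720;       // hypothetical preview size

        // Original landscape formula: frame axes map directly to surface axes.
        float landscapeScale = Math.min((float) surfaceHeight / frameHeight,
                                        (float) surfaceWidth / frameWidth);

        // Modified portrait formula (step1): the frame's height maps to the
        // surface's width and the frame's width to the surface's height.
        float portraitScale = Math.min((float) surfaceWidth / frameHeight,
                                       (float) surfaceHeight / frameWidth);

        System.out.println("landscape mScale = " + landscapeScale);  // 0.84375
        System.out.println("portrait  mScale = " + portraitScale);   // 1.125
    }
}
```

With the portrait formula the rotated frame fills the 1080px-wide wrapper instead of being shrunk by the landscape ratio.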

 

