
Android Practical Techniques, Part 33: A Guide to android.hardware.camera2


API level 21 deprecates the old Camera API in favor of the newly added camera2 API. This is a big change: the new API uses a different architecture, which makes it harder for developers to get started.
Let's begin with a diagram of the camera2 package architecture:
[Figure: camera2 package architecture diagram]
The design borrows the idea of a pipeline connecting the Android device and the camera: the system sends CaptureRequests to the camera, and the camera returns CameraMetadata. All of this takes place inside a session called a CameraCaptureSession.

Here are the main classes in the camera2 package:
[Figure: main classes of the camera2 package]
Among them, CameraManager sits at the top and manages all camera devices (CameraDevice), while each CameraDevice is responsible for creating its own CameraCaptureSession and its own CaptureRequests. CameraCharacteristics describes the properties of a CameraDevice; if you insist on a comparison, it is roughly the counterpart of the old CameraInfo.
The class diagram contains three important callbacks. They make the code harder to read, but you will have to get used to them, because this is the style of the new package. Of the three, CameraCaptureSession.CaptureCallback handles the preview and the captured picture, so it deserves special attention.
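To make the relationship between CameraManager and CameraCharacteristics concrete, here is a minimal sketch (not part of the demo below; it assumes it runs inside an Activity) that lists the available camera IDs and checks which way each lens faces:

    try {
        CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {
            // CameraCharacteristics plays roughly the role of the old CameraInfo.
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(id);
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                Log.d("linc", "camera " + id + " is front-facing");
            }
        }
    } catch (CameraAccessException e) {
        Log.e("linc", "failed to query cameras: " + e.getMessage());
    }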

How do these classes work together? Below is a simple flow chart.
[Figure: camera2 workflow flow chart]

I use a SurfaceView as the display target (a TextureView works as well; see the project listed in the references for details).
The core code is as follows:

        mCameraManager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
        mSurfaceView = (SurfaceView) findViewById(R.id.surfaceview);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                initCameraAndPreview();
            }

            // Required by SurfaceHolder.Callback; not used in this demo.
            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) { }
        });

    private void initCameraAndPreview() {
        Log.d("linc", "init camera and preview");
        HandlerThread handlerThread = new HandlerThread("Camera2");
        handlerThread.start();
        mHandler = new Handler(handlerThread.getLooper());
        try {
            mCameraId = "" + CameraCharacteristics.LENS_FACING_FRONT;
            mImageReader = ImageReader.newInstance(mSurfaceView.getWidth(), mSurfaceView.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/ 7);
            mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mHandler);

            mCameraManager.openCamera(mCameraId, DeviceStateCallback, mHandler);
        } catch (CameraAccessException e) {
            Log.e("linc", "open camera failed. " + e.getMessage());
        }
    }
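
The mOnImageAvailableListener passed to the ImageReader is not shown in the snippet above. A possible implementation, sketched here as an assumption (the "capture.jpg" output path and the save logic are not from the original demo; it uses android.media.Image, java.nio.ByteBuffer and java.io), simply pulls the latest JPEG frame and writes it to a file:

    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
            new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            // Grab the most recent JPEG frame and persist it.
            Image image = reader.acquireLatestImage();
            if (image == null) {
                return;
            }
            try {
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                byte[] bytes = new byte[buffer.remaining()];
                buffer.get(bytes);
                // "capture.jpg" is an assumed output path for this sketch.
                File file = new File(getExternalFilesDir(null), "capture.jpg");
                try (FileOutputStream output = new FileOutputStream(file)) {
                    output.write(bytes);
                }
            } catch (IOException e) {
                Log.e("linc", "save picture failed. " + e.getMessage());
            } finally {
                image.close();
            }
        }
    };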
private CameraDevice.StateCallback DeviceStateCallback = new CameraDevice.StateCallback() {

        @Override
        public void onOpened(CameraDevice camera) {
            Log.d("linc", "DeviceStateCallback: camera was opened.");
            mCameraOpenCloseLock.release();
            mCameraDevice = camera;
            try {
                createCameraCaptureSession();
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            mCameraOpenCloseLock.release();
            camera.close();
            mCameraDevice = null;
        }
        @Override
        public void onError(CameraDevice camera, int error) {
            mCameraOpenCloseLock.release();
            camera.close();
            mCameraDevice = null;
        }
    };
    private void createCameraCaptureSession() throws CameraAccessException {
        Log.d("linc", "createCameraCaptureSession");

        mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        mPreviewBuilder.addTarget(mSurfaceHolder.getSurface());
        mState = STATE_PREVIEW;
        mCameraDevice.createCaptureSession(
                Arrays.asList(mSurfaceHolder.getSurface(), mImageReader.getSurface()),
                mSessionPreviewStateCallback, mHandler);
    }
private CameraCaptureSession.StateCallback mSessionPreviewStateCallback = new
            CameraCaptureSession.StateCallback() {

        @Override
        public void onConfigured(CameraCaptureSession session) {
            Log.d("linc", "mSessionPreviewStateCallback onConfigured");
            mSession = session;
            try {
                mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                        CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
                session.setRepeatingRequest(mPreviewBuilder.build(), mSessionCaptureCallback, mHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
                Log.e("linc", "set preview builder failed. " + e.getMessage());
            }
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            Log.e("linc", "session configuration failed.");
        }
    };
private CameraCaptureSession.CaptureCallback mSessionCaptureCallback =
            new CameraCaptureSession.CaptureCallback() {

        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
                                       TotalCaptureResult result) {
//            Log.d("linc", "mSessionCaptureCallback, onCaptureCompleted");
            mSession = session;
            checkState(result);
        }

        @Override
        public void onCaptureProgressed(CameraCaptureSession session, CaptureRequest request,
                                        CaptureResult partialResult) {
            Log.d("linc", "mSessionCaptureCallback, onCaptureProgressed");
            mSession = session;
            checkState(partialResult);
        }

        private void checkState(CaptureResult result) {
            switch (mState) {
                case STATE_PREVIEW:
                    // NOTHING
                    break;
                case STATE_WAITING_CAPTURE:
                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    // Partial results may not carry the AF state yet, so guard against null.
                    if (afState == null) {
                        break;
                    }

                    if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
                            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState
                            || CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED == afState
                            || CaptureResult.CONTROL_AF_STATE_PASSIVE_UNFOCUSED == afState) {
                        //do something like save picture
                    }
                    break;
            }
        }

    };

Pressing the Capture button:

    public void onCapture(View view) {
        try {
            Log.i("linc", "take picture");
            mState = STATE_WAITING_CAPTURE;
            mSession.setRepeatingRequest(mPreviewBuilder.build(), mSessionCaptureCallback, mHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
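
Note that onCapture only switches mState and re-issues the repeating preview request; the picture itself would be taken in the "do something like save picture" branch of checkState() once auto-focus is locked. One way to fill that in (a sketch of a common approach, not code from the original demo; captureStillPicture is a hypothetical helper name) is to fire a one-shot still-capture request aimed at the ImageReader, so the JPEG lands in mOnImageAvailableListener:

    private void captureStillPicture() {
        try {
            // One-shot request built from the still-capture template, targeting the ImageReader.
            CaptureRequest.Builder captureBuilder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(mImageReader.getSurface());
            captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            mState = STATE_PREVIEW; // drop back to plain preview so checkState() fires only once
            mSession.capture(captureBuilder.build(), mSessionCaptureCallback, mHandler);
        } catch (CameraAccessException e) {
            Log.e("linc", "capture still picture failed. " + e.getMessage());
        }
    }

Calling captureStillPicture() from the STATE_WAITING_CAPTURE branch of checkState() would complete the flow.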

For testing I used the Genymotion emulator, which drives the laptop's webcam directly.
The configuration looks like this:
[Figure: Genymotion camera configuration]
The demo UI is shown below:
[Figure: demo UI screenshot]

 
