The Android Camera framework uses a client/server (C/S) architecture: the client and the server (CameraService) live in two separate processes and communicate over Binder.
I. Registration of CameraService
1. After the phone boots, the init.rc flow runs, and init.rc starts the MediaServer service:
service media /system/bin/mediaserver
    class main
    user root
    # google default
    # user media
    group audio camera inet net_bt net_bt_admin net_bw_acct drmrpc mediadrm media sdcard_r system net_bt_stack
    # google default
    # group audio camera inet net_bt net_bt_admin net_bw_acct drmrpc mediadrm
    ioprio rt 4
2. MediaServer's main() function lives in frameworks/av/media/mediaserver/main_mediaserver.cpp.
CameraService is registered inside the main() function of main_mediaserver.cpp:
int main(int argc __unused, char** argv)
{
    signal(SIGPIPE, SIG_IGN);
    char value[PROPERTY_VALUE_MAX];
    bool doLog = (property_get("ro.test_harness", value, "0") > 0) && (atoi(value) == 1);
    pid_t childPid;
    ..............................
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    ALOGI("ServiceManager: %p", sm.get());
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
#ifdef MTK_AOSP_ENHANCEMENT
    MemoryDumper::instantiate();
#endif
    CameraService::instantiate();
    ......................................
3. instantiate() is implemented in CameraService's parent class, the BinderService template:
namespace android {

template<typename SERVICE>
class BinderService
{
public:
    static status_t publish(bool allowIsolated = false) {
        sp<IServiceManager> sm(defaultServiceManager());
        return sm->addService(
                String16(SERVICE::getServiceName()),
                new SERVICE(), allowIsolated);
    }

    static void publishAndJoinThreadPool(bool allowIsolated = false) {
        publish(allowIsolated);
        joinThreadPool();
    }

    static void instantiate() { publish(); }

    static status_t shutdown() { return NO_ERROR; }

private:
    static void joinThreadPool() {
        sp<ProcessState> ps(ProcessState::self());
        ps->startThreadPool();
        ps->giveThreadPoolName();
        IPCThreadState::self()->joinThreadPool();
    }
};

}; // namespace android
We can see that the service registration happens in publish(). SERVICE is a template parameter; since CameraService is what is being registered here, we can substitute CameraService for it:
return sm->addService(String16(CameraService::getServiceName()), new CameraService());
With that, the camera service is registered with ServiceManager and is available to clients at any time.
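As a side note, once the service has been published, any native process can look it up by name. Below is a minimal sketch, assuming an AOSP module linking against libbinder and the conventional service name "media.camera" (returned by CameraService::getServiceName()); the helper name isCameraServicePublished is made up for illustration:

#include <binder/IServiceManager.h>
#include <binder/IBinder.h>
#include <utils/String16.h>

using namespace android;

// Returns true if CameraService has already been registered with ServiceManager.
static bool isCameraServicePublished() {
    sp<IServiceManager> sm = defaultServiceManager();
    // checkService() returns immediately (NULL if the name is not registered),
    // unlike getService(), which blocks and retries for a short while.
    sp<IBinder> binder = sm->checkService(String16("media.camera"));
    return binder != NULL;
}

checkService() is the non-blocking probe; the camera client shown later uses getService(), which keeps polling until the service appears.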
II. How the client connects to the server and opens the camera module
1. How the client connects to the server.
We start from Camera.open() and work down into the framework; it calls the open() method of frameworks/base/core/java/android/hardware/Camera.java:
public static Camera open() {
    ............................................
    return new Camera(cameraId);
    .............................................
}
This invokes Camera's constructor, which ends up calling cameraInitVersion():
private int cameraInitVersion(int cameraId, int halVersion) {
    ..................................................
    return native_setup(new WeakReference<Camera>(this), cameraId, halVersion, packageName);
}
From here we enter the JNI layer, android_hardware_Camera.cpp:
// connect to camera service
static jint android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId, jint halVersion, jstring clientPackageName)
{
    .......................
    camera = Camera::connect(cameraId, clientName, Camera::USE_CALLING_UID);
    ......................
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    ...........................
    camera->setListener(context);
    ..........................
}
Inside this JNI function we finally meet the client side of the Camera C/S architecture (the native Camera object returned by Camera::connect()); it calls connect() to send a connection request to the server. JNICameraContext is a listener class used to handle the data and messages delivered by the lower-layer camera callbacks; a reduced sketch of such a listener is shown below, after which we follow connect().
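To make the listener's role concrete, here is a reduced sketch of a camera listener; JNICameraContext has this shape and forwards each callback up to Java through JNI. The method set follows the CameraListener interface in camera/Camera.h of this era, and the class name MyCameraListener is made up purely for illustration:

#define LOG_TAG "CameraListenerDemo"
#include <camera/Camera.h>
#include <utils/Log.h>

using namespace android;

// Reduced sketch of a camera listener, modeled on CameraListener.
class MyCameraListener : public CameraListener {
public:
    // Asynchronous notifications (focus complete, errors, ...).
    virtual void notify(int32_t msgType, int32_t ext1, int32_t ext2) {
        ALOGV("notify: msgType=0x%x ext1=%d ext2=%d", msgType, ext1, ext2);
    }
    // Image/metadata callbacks (preview frames, JPEG data, face metadata, ...).
    virtual void postData(int32_t msgType, const sp<IMemory>& dataPtr,
                          camera_frame_metadata_t *metadata) {
        ALOGV("postData: msgType=0x%x", msgType);
    }
    // Timestamped callbacks (video recording frames).
    virtual void postDataTimestamp(nsecs_t timestamp, int32_t msgType,
                                   const sp<IMemory>& dataPtr) {
        ALOGV("postDataTimestamp: msgType=0x%x", msgType);
    }
};

// Usage mirrors the JNI code above:
//   sp<MyCameraListener> listener = new MyCameraListener();
//   camera->setListener(listener);

Returning to the connection path, Camera::connect() simply forwards to the CameraBase template: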
sp<Camera> Camera::connect(int cameraId, const String16& clientPackageName, int clientUid)
{
    return CameraBaseT::connect(cameraId, clientPackageName, clientUid);
}
template <typename TCam, typename TCamTraits>
sp<TCam> CameraBase<TCam, TCamTraits>::connect(int cameraId,
                                               const String16& clientPackageName,
                                               int clientUid)
{
    ALOGV("%s: connect", __FUNCTION__);
    sp<TCam> c = new TCam(cameraId);
    sp<TCamCallbacks> cl = c;
    status_t status = NO_ERROR;
    const sp<ICameraService>& cs = getCameraService();

    if (cs != 0) {
        TCamConnectService fnConnectService = TCamTraits::fnConnectService;
        status = (cs.get()->*fnConnectService)(cl, cameraId, clientPackageName, clientUid,
                                               /*out*/ c->mCamera);
    }
    if (status == OK && c->mCamera != 0) {
        c->mCamera->asBinder()->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        ALOGW("An error occurred while connecting to camera: %d", cameraId);
        c.clear();
    }
    return c;
}
// establish binder interface to camera service
template <typename TCam, typename TCamTraits>
const sp<ICameraService>& CameraBase<TCam, TCamTraits>::getCameraService()
{
    Mutex::Autolock _l(gLock);
    if (gCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16(kCameraServiceName));
            if (binder != 0) {
                break;
            }
            ALOGW("CameraService not published, waiting...");
            usleep(kCameraServicePollDelay);
        } while(true);
        if (gDeathNotifier == NULL) {
            gDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(gDeathNotifier);
        gCameraService = interface_cast<ICameraService>(binder);
    }
    ALOGE_IF(gCameraService == 0, "no CameraService!?");
    return gCameraService;
}
Here we finally obtain the CameraService handle; it is retrieved over Binder (what the client actually gets is the ICameraService proxy interface).
Next, let's look at what fnConnectService is. In Camera.cpp we have:
CameraTraits<Camera>::TCamConnectService CameraTraits<Camera>::fnConnectService =
&ICameraService::connect;
In other words, fnConnectService is a pointer to the member function ICameraService::connect, so the call (cs.get()->*fnConnectService)(...) invokes connect() on the ICameraService Binder proxy, which transacts into the server process. The pointer-to-member-function syntax is illustrated by the standalone sketch below, after which we finally enter the server-side flow.
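A self-contained illustration of the pattern, using made-up types that have nothing to do with the Android sources:

#include <cstdio>

struct Service {
    int connect(int cameraId) {
        std::printf("connect(%d) called\n", cameraId);
        return 0;
    }
};

// Type: pointer to a member function of Service taking int and returning int.
typedef int (Service::*ConnectFn)(int);

int main() {
    Service s;
    Service* svc = &s;
    ConnectFn fn = &Service::connect;  // analogous to fnConnectService = &ICameraService::connect
    (svc->*fn)(0);                     // analogous to (cs.get()->*fnConnectService)(...)
    return 0;
}

Back on the connection path: the proxy call made through fnConnectService lands in CameraService::connect() on the server side: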
status_t CameraService::connect(
        const sp<ICameraClient>& cameraClient,
        int cameraId,
        const String16& clientPackageName,
        int clientUid,
        /*out*/
        sp<ICamera>& device) {
    .........................
    status_t status = validateConnect(cameraId, /*inout*/clientUid);
    ..................................
    if (!canConnectUnsafe(cameraId, clientPackageName,
                          cameraClient->asBinder(),
                          /*out*/clientTmp)) {
        return -EBUSY;
    }

    status = connectHelperLocked(/*out*/client,
                                 cameraClient,
                                 cameraId,
                                 clientPackageName,
                                 clientUid,
                                 callingPid);

    device = client;
    return OK;
}
status_t CameraService::connectHelperLocked(
        /*out*/
        sp<Client>& client,
        /*in*/
        const sp<ICameraClient>& cameraClient,
        int cameraId,
        const String16& clientPackageName,
        int clientUid,
        int callingPid,
        int halVersion,
        bool legacyMode) {
    ....................................
    client = new CameraClient(this, cameraClient, clientPackageName,
                              cameraId, facing, callingPid, clientUid,
                              getpid(), legacyMode);
    ....................................
    status_t status = connectFinishUnsafe(client, client->getRemote());
    ............................................
}
In connectHelperLocked(), CameraService creates a client that is actually one of its own inner classes, CameraClient (note: this is the CameraClient inside CameraService, not the client-side Camera). That client is also assigned to the out-parameter device, and device is exactly c->mCamera back in CameraBase.cpp on the client side:
status = (cs.get()->*fnConnectService)(cl, cameraId, clientPackageName, clientUid, /*out*/ c->mCamera);
This c->mCamera is the mCamera used whenever the client-side Camera.cpp calls functions such as startPreview() or takePicture(), so every such call goes straight to the corresponding function of CameraClient. That is how the client-server relationship is really established. Take startPreview() in Camera.cpp:
// start preview mode
status_t Camera::startPreview()
{
    ALOGV("startPreview");
    sp<ICamera> c = mCamera;
    if (c == 0) return NO_INIT;
    return c->startPreview();
}
It ends up directly in CameraClient.cpp inside CameraService:
// start preview mode
status_t CameraClient::startPreview() {
    LOG1("startPreview (pid %d)", getCallingPid());
    return startCameraMode(CAMERA_PREVIEW_MODE);
}
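How does that call cross the process boundary? On the client side c->mCamera is really a BpCamera proxy: its startPreview() packs the arguments into a Parcel and sends a Binder transaction, and on the server side BnCamera::onTransact() unpacks it and calls the virtual startPreview() implemented by CameraClient (see frameworks/av/camera/ICamera.cpp). The toy model below imitates that proxy/stub hop without any real Binder code; all names (Transport, FakeBpCamera, ...) are invented for illustration:

#include <cstdio>

enum { START_PREVIEW = 1 };   // illustrative transaction code

// Server-side stub base: decodes the transaction code and calls the virtual.
struct FakeBnCamera {
    virtual ~FakeBnCamera() {}
    virtual int startPreview() = 0;
    int onTransact(int code) {
        switch (code) {
            case START_PREVIEW: return startPreview();   // lands in CameraClient
            default:            return -1;
        }
    }
};

// The "CameraClient" living inside the camera service process.
struct FakeCameraClient : FakeBnCamera {
    virtual int startPreview() {
        std::printf("CameraClient::startPreview()\n");
        return 0;
    }
};

// Stand-in for the Binder driver: routes a transaction to the remote stub.
struct Transport {
    FakeBnCamera* remoteStub;
    int transact(int code) { return remoteStub->onTransact(code); }
};

// Client-side proxy: what c->mCamera amounts to in the application process.
struct FakeBpCamera {
    Transport* transport;
    int startPreview() { return transport->transact(START_PREVIEW); }
};

int main() {
    FakeCameraClient service;            // server side
    Transport binder = { &service };     // "Binder driver"
    FakeBpCamera proxy = { &binder };    // client side (c->mCamera)
    proxy.startPreview();                // prints "CameraClient::startPreview()"
    return 0;
}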
2. Initialization and operation of the server side (which is really CameraClient).
We have just seen that the client-server call relationship maps functions in Camera.cpp to the corresponding functions of CameraClient. Next, let's look at how CameraClient is initialized.
Inside connectFinishUnsafe():
status_t status = client->initialize(mModule);
Now look at CameraClient's initialize() function:
status_t CameraClient::initialize(camera_module_t *module) {
    int callingPid = getCallingPid();
    status_t res;
    mHardware = new CameraHardwareInterface(camera_device_name);
    res = mHardware->initialize(&module->common);
}
So CameraClient initialization amounts to: instantiate the camera HAL interface CameraHardwareInterface, then call its initialize() to enter the HAL layer and open the underlying camera driver:
status_t initialize(hw_module_t *module)
{
    ALOGI("Opening camera %s", mName.string());
    camera_module_t *cameraModule = reinterpret_cast<camera_module_t *>(module);
    camera_info info;
    status_t res = cameraModule->get_camera_info(atoi(mName.string()), &info);
    if (res != OK) return res;

    int rc = OK;
    if (module->module_api_version >= CAMERA_MODULE_API_VERSION_2_3 &&
        info.device_version > CAMERA_DEVICE_API_VERSION_1_0) {
        // Open higher version camera device as HAL1.0 device.
        rc = cameraModule->open_legacy(module, mName.string(),
                                       CAMERA_DEVICE_API_VERSION_1_0,
                                       (hw_device_t **)&mDevice);
    } else {
        rc = CameraService::filterOpenErrorCode(module->methods->open(
            module, mName.string(), (hw_device_t **)&mDevice));
    }
    if (rc != OK) {
        ALOGE("Could not open camera %s: %d", mName.string(), rc);
        return rc;
    }
    initHalPreviewWindow();
    return rc;
}
In mHardware->initialize(&module->common), the module argument is CameraService's mModule, which is a camera_module_t structure. How does it get initialized? We find the following function in CameraService:
void CameraService::onFirstRef()
{
    BnCameraService::onFirstRef();

    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID,
                      (const hw_module_t **)&mModule) < 0) {
        LOGE("Could not load camera HAL module");
        mNumberOfCameras = 0;
    }
}
Anyone familiar with the HAL layer knows that hw_get_module() is the function used to obtain a module's HAL stub. Here it looks up the camera HAL module by CAMERA_HARDWARE_MODULE_ID and stores it in mModule; from then on the camera module is controlled entirely through mModule. A minimal sketch of this lookup pattern follows.
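The sketch below assumes an AOSP native module linking against libhardware (the helper name loadCameraHal is made up for illustration); it mirrors what CameraService does, including querying the camera count right after loading the module:

#define LOG_TAG "CameraHalDemo"
#include <hardware/hardware.h>
#include <hardware/camera.h>
#include <utils/Log.h>

// Sketch of the hw_get_module() pattern used by CameraService::onFirstRef().
static camera_module_t* loadCameraHal() {
    const hw_module_t* module = NULL;
    // hw_get_module() dlopen()s the matching camera.<variant>.so from
    // /system/lib/hw or /vendor/lib/hw and returns its hw_module_t descriptor.
    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID, &module) < 0) {
        ALOGE("Could not load camera HAL module");
        return NULL;
    }
    camera_module_t* cameraModule = (camera_module_t*)module;
    // The camera module interface exposes, among other things, the camera count.
    ALOGI("camera HAL loaded, %d camera(s)", cameraModule->get_number_of_cameras());
    return cameraModule;
}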
So when is onFirstRef() itself called? onFirstRef() comes from the parent class RefBase; it is invoked when a strong pointer (sp) takes its first reference to an object, i.e. the first time an object gets wrapped in an sp. So when does that happen for the camera? We can see it when the client initiates a connection (quoted here in the older, simplified form used by the Android 4.0 reference):

sp<Camera> Camera::connect(int cameraId)
{
    LOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService();
}
Here an instance is created and wrapped in an sp, so the strong reference count goes from zero to one and onFirstRef() is invoked. Strictly speaking, what the client obtains from getCameraService() is only the ICameraService Binder proxy; CameraService::onFirstRef() itself runs in the mediaserver process, the first time the newly created CameraService object is wrapped in an sp — that is, when publish() hands new CameraService() to addService(). Either way, by the time a client connects, onFirstRef() has already run and mModule holds the camera HAL module.
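To pin down exactly when onFirstRef() fires, here is a small sketch using the same RefBase/sp machinery from libutils (the Dummy class and demo() function are hypothetical, purely for illustration):

#define LOG_TAG "RefBaseDemo"
#include <utils/RefBase.h>
#include <utils/Log.h>

using namespace android;

// Demonstrates that RefBase::onFirstRef() runs when the first strong
// reference (sp) is taken, not when the object is constructed.
class Dummy : public RefBase {
public:
    Dummy() { ALOGI("Dummy constructed"); }
protected:
    virtual void onFirstRef() { ALOGI("Dummy::onFirstRef()"); }
};

void demo() {
    Dummy* raw = new Dummy();   // constructor runs; onFirstRef() not yet
    sp<Dummy> strong = raw;     // first strong reference -> onFirstRef() fires
    sp<Dummy> another = strong; // additional references do not call it again
}

For CameraService, that first strong reference is taken inside mediaserver, which is why the camera HAL gets loaded there, long before any client calls connect().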
Once CameraService::connect() has instantiated the CameraClient and opened the driver, returning that CameraClient means the client-server connection is established and the camera has finished initializing.
Summary: in short, the camera initialization flow is:
-> the system first registers the CameraService service
-> the application layer calls Camera.open()
-> Camera.java calls the JNI method native_setup()
-> the JNI layer calls android_hardware_Camera_native_setup()
-> the native client (Camera.cpp) calls connect() to reach the server (CameraService.cpp) and obtains an instance of CameraService's CameraClient
-> the server-side CameraClient is initialized, instantiating the camera HAL interface CameraHardwareInterface
-> CameraHardwareInterface opens the camera driver; initialization is complete
The net result is that the client holds (a proxy to) an instance of CameraClient inside CameraService, and every client-side function call ultimately invokes the corresponding CameraClient function.
References:
Android 4.0 Camera architecture analysis: Camera initialization, http://blog.csdn.net/dnfchan/article/details/7594590
Android 5.1 Camera framework source code