Please credit the source when reposting. Thanks!

Camera Service Framework Classes

How does the camera service actually work? To answer that, we first need to understand the classes that make up the Service Framework and how they are connected through Binder RPC. Figure 1 illustrates the relationship between these classes and Binder RPC across the three different parts.

(a) The Camera class inherits from ICameraClient and is responsible for passing Binder RPC data between the application and the camera service.

(b) The CameraService class inherits from ICameraService and is responsible for the connection between the application and the camera service.

(c) The CameraService::Client class inherits from ICamera and is responsible for configuring and controlling the camera device, and for handling events coming from it.

The concrete relationship between each class and Binder RPC:

(1) The native methods of android.hardware.Camera call the member functions of the native Camera class through JNI.

(2) When the application connects to the camera service, Camera performs Binder RPC with BnCameraService (the service stub) through BpCameraService (the service proxy); the exchange goes through the ICameraService interface.

(3) When the application requests the camera device or the preview function, Camera performs Binder RPC with BnCamera through BpCamera (the service proxy); the exchange goes through the ICamera interface.

(4) When an event occurs on the camera device, CameraService::Client performs Binder RPC with BnCameraClient through BpCameraClient; the exchange goes through the ICameraClient interface.
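To make the proxy/stub relationships above more concrete, here is a simplified sketch (not verbatim source) of how a Binder interface such as ICameraService is typically declared in this framework version; the BpCameraService proxy and BnCameraService stub are then built around it via the BpInterface/BnInterface templates:

// Simplified sketch of camera/ICameraService.h (assumed shape, not verbatim)
class ICameraService : public IInterface
{
public:
    enum {
        GET_NUMBER_OF_CAMERAS = IBinder::FIRST_CALL_TRANSACTION,
        GET_CAMERA_INFO,
        CONNECT
    };

    DECLARE_META_INTERFACE(CameraService);

    virtual int32_t     getNumberOfCameras() = 0;
    virtual status_t    getCameraInfo(int cameraId, struct CameraInfo* cameraInfo) = 0;
    virtual sp<ICamera> connect(const sp<ICameraClient>& cameraClient, int cameraId) = 0;
};

// Server-side stub: CameraService derives from BnCameraService and implements the pure virtuals.
class BnCameraService : public BnInterface<ICameraService>
{
public:
    virtual status_t onTransact(uint32_t code, const Parcel& data,
                                Parcel* reply, uint32_t flags = 0);
};

The same BpInterface/BnInterface pattern applies to ICamera and ICameraClient, which is why every hop in the list above has a Bp* proxy on one side and a Bn* implementation on the other.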

Camera Service Initialization

/frameworks/base/media/mediaserver/main_mediaserver.cpp

int main(int argc, char** argv)
{
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    LOGI("ServiceManager: %p", sm.get());

    // add for coredump. check only in debug mode
    {
        char value[PROPERTY_VALUE_MAX];
        property_get("ro.debuggable", value, "0");
        if (value[0] == '1')
        {
            struct rlimit rl;
            rl.rlim_cur = -1;
            rl.rlim_max = -1;
            setrlimit(4, &rl);
        }
    }

    VolumeManager::instantiate(); // volumemanager have to be started before audioflinger
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}

When I went looking for the source of CameraService::instantiate(), I hit a problem: there is no instantiate() method in CameraService.cpp. So where is it?

frameworks/base/media/mediaserver/main_mediaserver.cpp gives a clue:

#include <grp.h>

#include <binder/IPCThreadState.h>
#include <binder/ProcessState.h>
#include <binder/IServiceManager.h>
#include <utils/Log.h>

#include <AudioFlinger.h>
#include <CameraService.h>
#include <MediaPlayerService.h>
#include <AudioPolicyService.h>
#include <private/android_filesystem_config.h>

So let's take a look at frameworks/base/services/camera/libcameraservice/CameraService.h.

class CameraService :
    public BinderService<CameraService>,
    public BnCameraService
{
    class Client;
    friend class BinderService<CameraService>;
public:
    static char const* getServiceName() { return "media.camera"; }

    CameraService();
    virtual             ~CameraService();

    virtual int32_t     getNumberOfCameras();
    virtual status_t    getCameraInfo(int cameraId,
    ...

From the definition above, CameraService derives from BinderService<CameraService> and BnCameraService, so CameraService::instantiate() is probably inherited from a parent class. Looking into BinderService confirms it: the parent class does have an instantiate() method, and it is a static method.

frameworks/base/include/binder/BinderService.h

class BinderService
{
public:
    static status_t publish() {
        sp<IServiceManager> sm(defaultServiceManager());
        return sm->addService(String16(SERVICE::getServiceName()), new SERVICE());
    }

    static void publishAndJoinThreadPool() {
        sp<ProcessState> proc(ProcessState::self());
        sp<IServiceManager> sm(defaultServiceManager());
        sm->addService(String16(SERVICE::getServiceName()), new SERVICE());
        ProcessState::self()->startThreadPool();
        IPCThreadState::self()->joinThreadPool();
    }

    static void instantiate() { publish(); }

    static status_t shutdown() {
        return NO_ERROR;
    }
};

}; // namespace android

In the publish() function, CameraService registers itself as a service. The SERVICE used here is explained in the source:

template<typename SERVICE>

This means SERVICE is a template parameter. Since we are registering CameraService here, SERVICE can be replaced by CameraService, and the line above becomes:

return sm->addService(String16(CameraService::getServiceName()), new CameraService());

With that, the camera service is registered with the ServiceManager and is available for clients to use at any time.

main_mediaserver.cpp is compiled into out/target/product/sp8825ea/system/bin/mediaserver.

The main function of mediaserver is invoked via init.rc at boot, so the camera service is registered for Binder communication as soon as the device starts up:

service media /system/bin/mediaserver
    class main
    user media
    group system audio camera graphics inet net_bt net_bt_admin net_bw_acct drmrpc radio
    ioprio rt 4

Connecting to the Camera Service

Before using the camera service, the application must first connect to it (from the application's point of view, connecting simply means calling the open() function). During the connection, the Binder RPC of ICameraService acts as the bridge. Once the connection is established, the corresponding Binder RPC relationships are fixed and can be used to configure the camera device, send commands, and receive events.

Connection Process

(1) The application calls the open() method of android.hardware.Camera;

(2) open() calls native_setup();

(3) native_setup() calls the android_hardware_Camera_native_setup() function through JNI;

(4) android_hardware_Camera_native_setup() calls the connect() member function of Camera;

(5) Camera::connect() obtains the camera service information from the Context Manager (ServiceManager), creates the service proxy (BpCameraService), and connects to the camera service stub (BnCameraService) through Binder RPC;

(6) The actual connection is handled by CameraService's connect() method.

APP layer:

As in my demo:

public void surfaceCreated(SurfaceHolder holder)
{
    // int nCameras = Camera.getNumberOfCameras();
    mCamera = Camera.open(0);
    try {
        Log.i(TAG, "SurfaceHolder.Callback:surface Created");
        mCamera.setPreviewDisplay(mSurfaceHolder); // set the surface to be
                                                   // used for live preview
        mCamera.setPreviewCallback(previewCallback);
        mCamera.setErrorCallback(errorCallback);
        mCamera.startPreview();
        .......................................

Framework layer

frameworks/base/core/java/android/hardware/Camera.java

public static Camera open() {
    int numberOfCameras = getNumberOfCameras();
    CameraInfo cameraInfo = new CameraInfo();
    for (int i = 0; i < numberOfCameras; i++) {
        getCameraInfo(i, cameraInfo);
        if (cameraInfo.facing == CameraInfo.CAMERA_FACING_BACK) {
            return new Camera(i);
        }
    }
    return null;
}

Camera(int cameraId) {
    mShutterCallback = null;
    mRawImageCallback = null;
    mJpegCallback = null;
    mPreviewCallback = null;
    mPostviewCallback = null;
    mZoomListener = null;

    Looper looper;
    if ((looper = Looper.myLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else if ((looper = Looper.getMainLooper()) != null) {
        mEventHandler = new EventHandler(this, looper);
    } else {
        mEventHandler = null;
    }

    native_setup(new WeakReference<Camera>(this), cameraId);
}

JNI layer

frameworks/base/core/jni

 

static JNINativeMethod camMethods[] = {
  { "getNumberOfCameras",
    "()I",
    (void *)android_hardware_Camera_getNumberOfCameras },
  { "getCameraInfo",
    "(ILandroid/hardware/Camera$CameraInfo;)V",
    (void*)android_hardware_Camera_getCameraInfo },
  { "native_setup",
    "(Ljava/lang/Object;I)V",
    (void*)android_hardware_Camera_native_setup },
  { "native_release",
    "()V",
    (void*)android_hardware_Camera_release },
  { "setPreviewDisplay",
    "(Landroid/view/Surface;)V",
    (void *)android_hardware_Camera_setPreviewDisplay },

The implementation of android_hardware_Camera_native_setup():

static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId)
{
    sp<Camera> camera = Camera::connect(cameraId);

    if (camera == NULL) {
        jniThrowRuntimeException(env, "Fail to connect to camera service");
        return;
    }

    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }

    jclass clazz = env->GetObjectClass(thiz);
    if (clazz == NULL) {
        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
        return;
    }

    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);

    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());
}

Native library

frameworks/base/libs/camera/Camera.cpp

sp<Camera> Camera::connect(int cameraId)
{
    LOGV("connect");
    sp<Camera> c = new Camera();
    const sp<ICameraService>& cs = getCameraService();
    if (cs != 0) {
        c->mCamera = cs->connect(c, cameraId);
    }
    if (c->mCamera != 0) {
        c->mCamera->asBinder()->linkToDeath(c);
        c->mStatus = NO_ERROR;
    } else {
        c.clear();
    }
    return c;
}
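getCameraService() is not listed in this post. Roughly speaking, it asks the ServiceManager for the binder registered under the name "media.camera" (the name returned by CameraService::getServiceName()) and wraps it in an ICameraService proxy. A simplified sketch of Camera::getCameraService() in frameworks/base/libs/camera/Camera.cpp (details vary by version):

// Simplified sketch, not verbatim source
const sp<ICameraService>& Camera::getCameraService()
{
    Mutex::Autolock _l(mLock);
    if (mCameraService.get() == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            // look up the service registered by CameraService::instantiate()
            binder = sm->getService(String16("media.camera"));
            if (binder != 0) break;
            LOGW("CameraService not published, waiting...");
            usleep(500000); // retry every 0.5 s
        } while (true);
        if (mDeathNotifier == NULL) {
            mDeathNotifier = new DeathNotifier();
        }
        // get notified if mediaserver dies
        binder->linkToDeath(mDeathNotifier);
        mCameraService = interface_cast<ICameraService>(binder);
    }
    LOGE_IF(mCameraService == 0, "no CameraService!?");
    return mCameraService;
}

The interesting part is cs->connect(c, cameraId): c is the Camera object itself, which also plays the role of ICameraClient, so the service ends up holding a callback proxy back into the application process.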

frameworks/base/libs/camera/ICameraService.cpp

virtual sp<ICamera> connect(const sp<ICameraClient>& cameraClient, int cameraId)
{
    Parcel data, reply;
    data.writeInterfaceToken(ICameraService::getInterfaceDescriptor());
    data.writeStrongBinder(cameraClient->asBinder());
    data.writeInt32(cameraId);
    remote()->transact(BnCameraService::CONNECT, data, &reply);
    return interface_cast<ICamera>(reply.readStrongBinder());
}

 

frameworks/base/libs/camera/ICameraService.cpp

 

status_t BnCameraService::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch(code) {
        case GET_NUMBER_OF_CAMERAS: {
            CHECK_INTERFACE(ICameraService, data, reply);
            reply->writeInt32(getNumberOfCameras());
            return NO_ERROR;
        } break;
        case GET_CAMERA_INFO: {
            CHECK_INTERFACE(ICameraService, data, reply);
            CameraInfo cameraInfo;
            memset(&cameraInfo, 0, sizeof(cameraInfo));
            status_t result = getCameraInfo(data.readInt32(), &cameraInfo);
            reply->writeInt32(cameraInfo.facing);
            reply->writeInt32(cameraInfo.orientation);
            reply->writeInt32(result);
            return NO_ERROR;
        } break;
        case CONNECT: {
            CHECK_INTERFACE(ICameraService, data, reply);
            sp<ICameraClient> cameraClient = interface_cast<ICameraClient>(data.readStrongBinder());
            sp<ICamera> camera = connect(cameraClient, data.readInt32());
            reply->writeStrongBinder(camera->asBinder());
            return NO_ERROR;
        } break;
        default:
            return BBinder::onTransact(code, data, reply, flags);
    }
}

frameworks/base/services/camera/libcameraservice/CameraService.cpp

sp<ICamera> CameraService::connect(
        const sp<ICameraClient>& cameraClient, int cameraId) {
    int callingPid = getCallingPid();
    sp<CameraHardwareInterface> hardware = NULL;

    LOG1("CameraService::connect E (pid %d, id %d)", callingPid, cameraId);

    if (!mModule) {
        LOGE("Camera HAL module not loaded");
        return NULL;
    }

    sp<Client> client;
    if (cameraId < 0 || cameraId >= mNumberOfCameras) {
        LOGE("CameraService::connect X (pid %d) rejected (invalid cameraId %d).",
            callingPid, cameraId);
        return NULL;
    }

    char value[PROPERTY_VALUE_MAX];
    property_get("sys.secpolicy.camera.disabled", value, "0");
    if (strcmp(value, "1") == 0) {
        // Camera is disabled by DevicePolicyManager.
        LOGI("Camera is disabled. connect X (pid %d) rejected", callingPid);
        return NULL;
    }

    Mutex::Autolock lock(mServiceLock);
    if (mClient[cameraId] != 0) {
        client = mClient[cameraId].promote();
        if (client != 0) {
            if (cameraClient->asBinder() == client->getCameraClient()->asBinder()) {
                LOG1("CameraService::connect X (pid %d) (the same client)",
                    callingPid);
                return client;
            } else {
                LOGW("CameraService::connect X (pid %d) rejected (existing client).",
                    callingPid);
                return NULL;
            }
        }
        mClient[cameraId].clear();
    }

    if (mBusy[cameraId]) {
        LOGW("CameraService::connect X (pid %d) rejected"
             " (camera %d is still busy).", callingPid, cameraId);
        return NULL;
    }

    struct camera_info info;
    if (mModule->get_camera_info(cameraId, &info) != OK) {
        LOGE("Invalid camera id %d", cameraId);
        return NULL;
    }

    char camera_device_name[10];
    snprintf(camera_device_name, sizeof(camera_device_name), "%d", cameraId);

    hardware = new CameraHardwareInterface(camera_device_name);
    if (hardware->initialize(&mModule->common) != OK) {
        hardware.clear();
        return NULL;
    }

    client = new Client(this, cameraClient, hardware, cameraId, info.facing, callingPid);
    mClient[cameraId] = client;
    LOG1("CameraService::connect X");
    return client;
}

mModule is instantiated the first time the CameraService object is referenced: onFirstRef() overrides the parent class's method.

void CameraService::onFirstRef()
{
    BnCameraService::onFirstRef();

    if (hw_get_module(CAMERA_HARDWARE_MODULE_ID,
                (const hw_module_t **)&mModule) < 0) {
        LOGE("Could not load camera HAL module");
        mNumberOfCameras = 0;
    }
    else {
        mNumberOfCameras = mModule->get_number_of_cameras();
        LOGE("CameraService::onFirstRef mNumberOfCameras=%d", mNumberOfCameras);
        if (mNumberOfCameras > MAX_CAMERAS) {
            LOGE("Number of cameras(%d) > MAX_CAMERAS(%d).",
                    mNumberOfCameras, MAX_CAMERAS);
            mNumberOfCameras = MAX_CAMERAS;
        }
        //for (int i = 0; i < mNumberOfCameras; i++) {
        for (int i = 0; i < MAX_CAMERAS; i++) {
            setCameraFree(i);
        }
    }

    // Read the system property to determine if we have to use the
    // AUDIO_STREAM_ENFORCED_AUDIBLE type.
    char value[PROPERTY_VALUE_MAX];
    property_get("ro.camera.sound.forced", value, "0");
    if (strcmp(value, "0") != 0) {
        mAudioStreamType = AUDIO_STREAM_ENFORCED_AUDIBLE;
    } else {
        mAudioStreamType = AUDIO_STREAM_MUSIC;
    }

    // set AudioStreamType to "AUDIO_STREAM_ENFORCED_AUDIBLE"
    mAudioStreamType = AUDIO_STREAM_ENFORCED_AUDIBLE;
}

frameworks/base/services/camera/libcameraservice/CameraService.cpp

CameraService::Client::Client(const sp<CameraService>& cameraService,
        const sp<ICameraClient>& cameraClient,
        const sp<CameraHardwareInterface>& hardware,
        int cameraId, int cameraFacing, int clientPid) {
    int callingPid = getCallingPid();
    LOG1("Client::Client E (pid %d)", callingPid);

    mCameraService = cameraService;
    mCameraClient = cameraClient;
    mHardware = hardware;
    mCameraId = cameraId;
    mCameraFacing = cameraFacing;
    mClientPid = clientPid;
    mMsgEnabled = 0;
    mSurface = 0;
    mPreviewWindow = 0;

    mHardware->setCallbacks(notifyCallback,
                            dataCallback,
                            dataCallbackTimestamp,
                            (void *)cameraId);

    // Enable zoom, error, focus, and metadata messages by default
    enableMsgType(CAMERA_MSG_ERROR | CAMERA_MSG_ZOOM | CAMERA_MSG_FOCUS |
                  CAMERA_MSG_PREVIEW_METADATA);

    // Callback is disabled by default
    mPreviewCallbackFlag = CAMERA_FRAME_CALLBACK_FLAG_NOOP;
    mOrientation = getOrientation(0, mCameraFacing == CAMERA_FACING_FRONT);
    mPlayShutterSound = true;

    cameraService->setCameraBusy(cameraId);
    cameraService->loadSound();
    LOG1("Client::Client X (pid %d)", callingPid);
}

Through the process above, Camera and CameraService::Client have established a client/service relationship!

(It is also worth looking at the disconnect path taken from the destructor; a rough sketch follows.)
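A rough sketch of that disconnect path (simplified; the real CameraService::Client::disconnect() also performs pid checks and tears down the preview window):

// Simplified sketch of CameraService::Client::disconnect(), not verbatim source
void CameraService::Client::disconnect() {
    Mutex::Autolock lock(mLock);
    if (mHardware == 0) return;          // already disconnected

    disableMsgType(CAMERA_MSG_ALL_MSGS); // stop delivering callbacks to the app
    mHardware->stopPreview();
    mHardware->cancelPicture();
    mHardware->release();                // let the HAL release the camera device
    mHardware.clear();

    mCameraService->removeClient(mCameraClient);
    mCameraService->setCameraFree(mCameraId);
}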

Camera Settings and Control

The application uses the Binder RPC of ICamera to request changes to camera settings or to the preview. In the application, calling the setParameters() method passes the corresponding settings to the camera device in the form of Binder RPC.

Process Analysis

(1) The application layer changes the camera settings through the setParameters() method of android.hardware.Camera.

(2) android_hardware_Camera_setParameters() is called through JNI.

(3) android_hardware_Camera_setParameters() calls the setParameters() member function of Camera.

(4) Camera::setParameters() requests the settings change from BnCamera through BpCamera in the form of Binder RPC.

(5) CameraService::Client::setParameters() calls the setParameters() member function of CameraHardwareInterface;

(6) CameraHardwareInterface applies the changed settings to the camera device.

The app-layer code is the same as in the demo, so it is not examined again here.

Framework layer

frameworks/base/core/java/android/hardware/Camera.java

public void setParameters(Parameters params) {
    if ("invalid".equals(params.getFocusMode())) {
        throw new RuntimeException("Throw exception for invalid parameters,Because FocusMode parameters was invalid.");
    }
    if ("invalid".equals(params.getFlashMode())) {
        throw new RuntimeException("Throw exception for invalid parameters,Because FlashMode parameters was invalid.");
    }
    native_setParameters(params.flatten());
}

JNI layer

frameworks/base/core/jni/android_hardware_Camera.cpp

................
  { "native_takePicture",
    "(I)V",
    (void *)android_hardware_Camera_takePicture },
  { "native_setParameters",
    "(Ljava/lang/String;)V",
    (void *)android_hardware_Camera_setParameters },
  { "native_getParameters",
    "()Ljava/lang/String;",
    (void *)android_hardware_Camera_getParameters },
  { "reconnect",
....................

 

frameworks/base/core/jni/android_hardware_Camera.cpp

static void android_hardware_Camera_setParameters(JNIEnv *env, jobject thiz, jstring params)
{
    LOGV("setParameters");
    sp<Camera> camera = get_native_camera(env, thiz, NULL);
    if (camera == 0) return;

    const jchar* str = env->GetStringCritical(params, 0);
    String8 params8;
    if (params) {
        params8 = String8(str, env->GetStringLength(params));
        env->ReleaseStringCritical(params, str);
    }
    if (camera->setParameters(params8) != NO_ERROR) {
        jniThrowRuntimeException(env, "setParameters failed");
        return;
    }
}

sp<Camera> get_native_camera(JNIEnv *env, jobject thiz, JNICameraContext** pContext)
{
    sp<Camera> camera;
    Mutex::Autolock _l(sLock);
    JNICameraContext* context = reinterpret_cast<JNICameraContext*>(env->GetIntField(thiz, fields.context));
    if (context != NULL) {
        camera = context->getCamera();
    }
    LOGV("get_native_camera: context=%p, camera=%p", context, camera.get());

    if (camera == 0) {
        jniThrowRuntimeException(env, "Method called after release()");
    }

    if (pContext != NULL) *pContext = context;
    return camera;
}

Native library

frameworks/base/libs/camera/Camera.cpp

status_t Camera::setParameters(const String8& params)
{
    LOGV("setParameters");
    sp <ICamera> c = mCamera;
    if (c == 0) return NO_INIT;
    return c->setParameters(params);
}

frameworks/base/libs/camera/ICamera.cpp

// set preview/capture parameters - key/value pairs
status_t setParameters(const String8& params)
{
    LOGV("setParameters");
    Parcel data, reply;
    data.writeInterfaceToken(ICamera::getInterfaceDescriptor());
    data.writeString8(params);
    remote()->transact(SET_PARAMETERS, data, &reply);
    return reply.readInt32();
}

frameworks/base/libs/camera/ICamera.cpp

status_t BnCamera::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch(code) {
        // ... some cases omitted here ...
        case SET_PARAMETERS: {
            LOGV("SET_PARAMETERS");
            CHECK_INTERFACE(ICamera, data, reply);
            String8 params(data.readString8());
            reply->writeInt32(setParameters(params));
            return NO_ERROR;
        } break;
        // ... some cases omitted here ...

frameworks/base/services/camera/libcameraservice/CameraService.cpp

status_t CameraService::Client::setParameters(const String8& params) {
    LOG1("setParameters (pid %d) (%s)", getCallingPid(), params.string());

    Mutex::Autolock lock(mLock);
    status_t result = checkPidAndHardware();
    if (result != NO_ERROR) return result;

    CameraParameters p(params);
    return mHardware->setParameters(p);
}

mHardware holds the CameraHardwareInterface instance; the last statement applies the settings to the device through mHardware.
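CameraHardwareInterface is a thin wrapper around the camera HAL device (camera_device_t); roughly, its setParameters() flattens the key/value pairs back into a string and hands them to the HAL's set_parameters hook. A simplified sketch (assumed shape of frameworks/base/services/camera/libcameraservice/CameraHardwareInterface.h, not verbatim):

// Simplified sketch, not verbatim source
status_t setParameters(const CameraParameters &params)
{
    LOGV("%s(%s)", __FUNCTION__, mName.string());
    // forward the flattened parameter string to the camera HAL device
    if (mDevice->ops->set_parameters)
        return mDevice->ops->set_parameters(mDevice, params.flatten().string());
    return INVALID_OPERATION;
}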

Camera Event Handling

When an event occurs on the camera device, the camera service passes it to the application using the Binder RPC of ICameraClient. For example, when the application calls takePicture() to capture a still image, the camera device notifies the application asynchronously once the still image is ready. In this process, the shutter event occurs first, followed by the events for the RAW image and the JPEG image.

When the application calls the takePicture() method of android.hardware.Camera to request a still image, CameraService::Client::takePicture() is eventually invoked (as in the settings-and-control section, the call path from the application layer down to the native library is not repeated here).
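For reference, the native hop that is skipped here looks much like setParameters(): Camera::takePicture() simply forwards to the ICamera proxy, which sends a TAKE_PICTURE transaction to CameraService::Client. A simplified sketch of Camera::takePicture() in frameworks/base/libs/camera/Camera.cpp (not verbatim):

// Simplified sketch, not verbatim source
status_t Camera::takePicture(int msgType)
{
    LOGV("takePicture: 0x%x", msgType);
    sp<ICamera> c = mCamera;            // BpCamera obtained in Camera::connect()
    if (c == 0) return NO_INIT;
    return c->takePicture(msgType);     // Binder RPC to CameraService::Client::takePicture()
}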

Framework and native library

frameworks/base/services/camera/libcameraservice/CameraService.cpp

status_t CameraService::Client::takePicture(int msgType) {
    LOG1("takePicture (pid %d): 0x%x", getCallingPid(), msgType);

    Mutex::Autolock lock(mLock);
    status_t result = checkPidAndHardware();
    if (result != NO_ERROR) return result;

    if ((msgType & CAMERA_MSG_RAW_IMAGE) &&
        (msgType & CAMERA_MSG_RAW_IMAGE_NOTIFY)) {
        LOGE("CAMERA_MSG_RAW_IMAGE and CAMERA_MSG_RAW_IMAGE_NOTIFY"
                " cannot be both enabled");
        return BAD_VALUE;
    }

    // We only accept picture related message types
    // and ignore other types of messages for takePicture().
    int picMsgType = msgType
                   & (CAMERA_MSG_SHUTTER |
                      CAMERA_MSG_POSTVIEW_FRAME |
                      CAMERA_MSG_RAW_IMAGE |
                      CAMERA_MSG_RAW_IMAGE_NOTIFY |
                      CAMERA_MSG_COMPRESSED_IMAGE);

    enableMsgType(picMsgType);

    return mHardware->takePicture();
}

CameraService::Client::takePicture() first enables the corresponding message types. When events such as CAMERA_MSG_SHUTTER, CAMERA_MSG_POSTVIEW_FRAME, or CAMERA_MSG_RAW_IMAGE occur on the camera device, they are captured and the corresponding still-image handler functions are called.
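The hop from the HAL callback to these handlers goes through the static notifyCallback()/dataCallback() functions that were registered in the Client constructor via mHardware->setCallbacks(). Roughly (a simplified sketch, not verbatim source):

// Simplified sketch of CameraService::Client::notifyCallback()
void CameraService::Client::notifyCallback(int32_t msgType,
        int32_t ext1, int32_t ext2, void* user) {
    // 'user' is the cameraId cookie passed to setCallbacks(); map it back to the Client
    sp<Client> client = getClientFromCookie(user);
    if (client == 0) return;
    if (!client->lockIfMessageWanted(msgType)) return;   // drop messages that are not enabled

    switch (msgType) {
        case CAMERA_MSG_SHUTTER:
            client->handleShutter();
            break;
        default:
            client->handleGenericNotify(msgType, ext1, ext2);
            break;
    }
}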

Take the handleShutter() function of CameraService::Client as an example:

frameworks/base/services/camera/libcameraservice/CameraService.cpp

void CameraService::Client::handleShutter(void) {
    if (mPlayShutterSound) {
        mCameraService->playSound(SOUND_SHUTTER);
    }

    sp<ICameraClient> c = mCameraClient;
    if (c != 0) {
        mLock.unlock();
        c->notifyCallback(CAMERA_MSG_SHUTTER, 0, 0);
        if (!lockIfMessageWanted(CAMERA_MSG_SHUTTER)) return;
    }
    disableMsgType(CAMERA_MSG_SHUTTER);

    mLock.unlock();
}

mCameraClient holds the BpCameraClient instance; through Binder RPC it calls Camera's notifyCallback() function to handle the event that has occurred. disableMsgType() then disables the CAMERA_MSG_SHUTTER message.
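On the service side, this call goes out through the ICameraClient proxy, which packages the message into a one-way Binder transaction; on the application side, BnCameraClient unpacks it and invokes Camera::notifyCallback(). A simplified sketch of the proxy side in frameworks/base/libs/camera/ICameraClient.cpp (not verbatim):

// Simplified sketch of BpCameraClient::notifyCallback()
// generic callback from camera service to app
void notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2)
{
    LOGV("notifyCallback");
    Parcel data, reply;
    data.writeInterfaceToken(ICameraClient::getInterfaceDescriptor());
    data.writeInt32(msgType);
    data.writeInt32(ext1);
    data.writeInt32(ext2);
    // one-way: the camera service does not block waiting for the app
    remote()->transact(NOTIFY_CALLBACK, data, &reply, IBinder::FLAG_ONEWAY);
}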

frameworks/base/libs/camera/Camera.cpp

void Camera::notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2)
{
    sp<CameraListener> listener;
    {
        Mutex::Autolock _l(mLock);
        listener = mListener;
    }
    if (listener != NULL) {
        listener->notify(msgType, ext1, ext2);
    }
}

In Camera::notifyCallback(), the notify() function of CameraListener is called to deliver the event received over Binder RPC to the application. The mListener member holds the JNICameraContext instance, which passes data between the application and the Service Framework through JNI.
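JNICameraContext::notify() is the piece that crosses back from native code into Java; roughly, it grabs a JNIEnv and calls the static Camera.postEventFromNative() method, which posts the message to the EventHandler created in the Camera(int cameraId) constructor. A simplified sketch (assumed shape of frameworks/base/core/jni/android_hardware_Camera.cpp, not verbatim):

// Simplified sketch, not verbatim source
void JNICameraContext::notify(int32_t msgType, int32_t ext1, int32_t ext2)
{
    LOGV("notify");
    Mutex::Autolock _l(mLock);
    if (mCameraJObjectWeak == NULL) {
        LOGW("callback on dead camera object");
        return;
    }
    JNIEnv *env = AndroidRuntime::getJNIEnv();
    // post the event to the Java Camera object; it is delivered on the
    // EventHandler's Looper and dispatched to the registered callbacks
    env->CallStaticVoidMethod(mCameraJClass, fields.post_event,
            mCameraJObjectWeak, msgType, ext1, ext2, NULL);
}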
