I. Configuring camera orientation vs. LCD orientation:

On the hardware side:
The sensor is a rectangle with a long side and a short side, usually in a 4:3 aspect ratio, and it always scans along the long edge. Taking a 2M chip as an example, exposure starts at pixel (1,1) and proceeds to the end point (1200,1600).

The LCD, however, can scan in two ways: along the short edge (physically portrait) or along the long edge (physically landscape).

Along the short edge: (1,1) -> (800,600)

Along the long edge: (1,1) -> (600,800)

For phones and tablets, the hardware mounting should keep the sensor's long/short axes physically aligned with the panel's. A phone panel, say, is 600*800 while a tablet panel is 800*600 (their long axes differ), so in the phone design a rotation is needed to bring the sensor's orientation in line with the LCD's.

(1) When the LCD scans along the long edge (physically landscape), the LCD scan direction (left to right) matches the camera scan direction (left to right).

(2) When the LCD scans along the short edge (physically portrait), the LCD and camera scan directions are perpendicular: the camera scans top to bottom, the LCD left to right, so the LCD preview image must be rotated 90°. If the module is mounted wrongly so that the camera scan direction is taken to match the LCD's, the preview image comes out stretched.

On the software side:
On phones the back camera usually gets a 90° rotation and the front camera 270°; the framework reads this value from the HAL layer (which the HAL in turn gets from the dtsi) and the rotation is then handled in surfaceflinger.
Tablets need no rotation: when configuring a tablet's front and back cameras, simply set mount_angle to 0.

With the phone in portrait, the sensor sits vertically, i.e. its long-edge scan direction is vertical. If the flash does not cover the whole exposure, the right half of the frame gets no light, so on the LCD the left half looks normal while the right half is under-exposed.

http://wenku.baidu.com/link?url=7VSHC5qF-1peOGWQOFMEVceDJkSvgHaCZBcSwCo715BNPP6Lhwd-gdEZdaed4PnuF9J294QV7VhVl-C2SjexqaniPqZLM7wQFFpD3Am_a1y

If the back camera is not given the 90° rotation, the preview is still correct, but the captured photo is rotated 90° to the left, and shows the same left rotation when viewed on a PC.

If the front camera is not given its 90° rotation, the preview is correct but the photo is rotated 90° to the right, and likewise on a PC ---- presumably because PC screens are landscape.

Android 2.2 was the first release to ship a YUV-format decoder (YuvImage); on earlier versions the decoding had to be done by hand.

frameworks/base/core/java/android/hardware/Camera.java

The orientation of the camera image. The value is the angle that the camera image needs to be rotated clockwise so it shows correctly on the display in its natural orientation. It should be 0, 90, 180, or 270. For example, suppose a device has a naturally tall screen. The back-facing camera sensor is mounted in landscape. You are looking at the screen. If the top side of the camera sensor is aligned with the right edge of the screen in natural orientation, the value should be 90. If the top side of a front-facing camera sensor is aligned with the right of the screen, the value should be 270.
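The rule quoted above reduces to a few lines of arithmetic. Below is a sketch of the orientation math from the official setCameraDisplayOrientation sample; the class name and the plain-int signature are our own (the real code reads the values from Camera.CameraInfo and Display.getRotation()):

```java
// Compute how many degrees the preview must be rotated for display,
// following the formula in the Android Camera.setCameraDisplayOrientation docs.
// cameraOrientation: the CameraInfo.orientation value (0/90/180/270).
// displayRotationDeg: the current display rotation in degrees (0/90/180/270).
class PreviewOrientation {
    static int displayOrientation(boolean frontFacing, int cameraOrientation,
                                  int displayRotationDeg) {
        if (frontFacing) {
            int result = (cameraOrientation + displayRotationDeg) % 360;
            return (360 - result) % 360; // compensate for the front camera's mirror
        }
        return (cameraOrientation - displayRotationDeg + 360) % 360;
    }
}
```

A back camera mounted at 90° on a portrait device (display rotation 0) needs a 90° rotation, and a front camera mounted at 270° also comes out at 90°, matching the 90/270 mount values discussed earlier.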

A photo taken in portrait comes out rotated 90° if used directly. Following the stock Gallery code: first query the orientation field from the MediaScanner database, then have the app rotate the image back by that angle before displaying it.

Forcing landscape also works: android:screenOrientation="landscape"

--------------------------------------------------------------the explanation below should be correct----------------------------------------------------------

Querying the media database's orientation via a cursor is not complete, though, and fails in some cases. For example, when the system camera is invoked and a full-size image is produced through a MediaStore uri, this method always returns 0, yet the image is actually rotated 90°. From my investigation,

the reason is that the MediaStore uri (of the form "content:....") is not the image's physical path; it is the image's query address in the media database. So if the rotation value was never initialized when the full-size image was generated through the media store

(my guess, since I inserted no other metadata when generating the image), it defaults to 0, while the actual default rotation depends on the hardware: most tablets seem landscape-first (the 90° rotation exists to fit landscape by default).
The fix is to resolve the MediaStore uri (e.g. "content:..../47.") to the image's physical path (e.g. "/xx/xx/xx1545515.jpg"), then read the image's actual rotation from that path via ExifInterface,

and finally use a Matrix to rotate it upright. (If you read the rotation via the media store's uri.getPath(), you just get 0.)
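The EXIF-then-Matrix step can be sketched as pure arithmetic. The numeric tag values below mirror the EXIF specification (and ExifInterface's ORIENTATION_* constants); the helper name is ours:

```java
// Map an EXIF orientation tag value to the clockwise rotation, in degrees,
// that a Matrix.postRotate() call needs to show the image upright.
class ExifRotation {
    static int exifToDegrees(int exifOrientation) {
        switch (exifOrientation) {
            case 6:  return 90;   // ORIENTATION_ROTATE_90
            case 3:  return 180;  // ORIENTATION_ROTATE_180
            case 8:  return 270;  // ORIENTATION_ROTATE_270
            default: return 0;    // ORIENTATION_NORMAL or undefined
        }
    }
}
```

On Android this value would feed new Matrix().postRotate(degrees) before Bitmap.createBitmap() to straighten the image.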

-------------------------------------------------------------------------------------------------------------------------------------------------

The simplest form of takePicture sets all parameters to null. The photo is still captured, but we get no reference to it, so at least one callback should be implemented. The safest is Camera.PictureCallback.onPictureTaken: it is guaranteed to be invoked, and it is called with the compressed image. To use it, we implement Camera.PictureCallback in the Activity and add an onPictureTaken method.

public class SnapShot extends Activity implements SurfaceHolder.Callback, Camera.PictureCallback {
    public void onPictureTaken(byte[] data, Camera camera) {
    }
}

The onPictureTaken method takes two parameters: the first is a byte array holding the actual JPEG image data, the second a reference to the Camera object that captured it. Since we are handed the actual JPEG data, saving it

is just a matter of writing it somewhere on disk, and as we already know, MediaStore can be used to choose its location and hold its metadata.

While onPictureTaken is executing, it is safe to call startPreview on the Camera object: the preview was paused automatically when takePicture was called, and this callback tells us it can now be restarted.

public void onPictureTaken(byte[] data, Camera camera) {
    Uri imageFileUri = getContentResolver().insert(Media.EXTERNAL_CONTENT_URI, new ContentValues());
    try {
        OutputStream imageFileOS = getContentResolver().openOutputStream(imageFileUri);
        imageFileOS.write(data);
        imageFileOS.flush();
        imageFileOS.close();
    } catch (FileNotFoundException e) {
    } catch (IOException e) {
    }
    camera.startPreview();
}

The snippet above inserts a new record into the MediaStore and gets back a URI. From that URI we obtain an OutputStream for writing the JPEG data, which creates a file at the location the MediaStore chose and links it to the new record. If we later want to update the metadata stored in the record, we can do so with a fresh ContentValues object.

ContentValues contentValues = new ContentValues(3);
contentValues.put(Media.DISPLAY_NAME, "This is a test title");

EXIF (Exchangeable Image File) is metadata tailored to digital-camera photos: it records the shooting parameters, a thumbnail, and other attributes of a digital picture.

Metadata, commonly glossed as "data about data", is the description of a data resource. It is the foundation of information sharing and exchange, describing a dataset's content, quality, representation, spatial reference, management, and other characteristics.

Android Camera API2 uses CameraMetadata for parameter exchange between the app and the HAL: http://www.2cto.com/kf/201510/448174.html

The four callbacks of the Android camera HAL: http://blog.csdn.net/kickxxx/article/details/19111005

Android Camera HAL V3 Vendor Tags and V1/V3 parameter conversion: http://blog.csdn.net/dfysy/article/details/42805929

II. The YUV sensor AF flow:

First, how the YUV path reaches the corresponding driver:

06-03 19:13:37.869   314  4564 E mm-camera-sensor: csiphy_open:89 csiphy subdev name = /dev/v4l-subdev1 --- front csiphy
06-03 19:13:37.879   314  4564 E mm-camera-sensor: sensor_init:474 subdev name v4l-subdev13
06-03 19:13:43.529   314  4742 E mm-camera-sensor: csiphy_open:89 csiphy subdev name = /dev/v4l-subdev0 --- back csiphy
06-03 19:13:43.539   314  4742 E mm-camera-sensor: sensor_init:474 subdev name v4l-subdev13
--- Why 13 both times? The code logic is flawed: each slave->subdev_name is saved into the single g_subdev_name char array with no per-camera slot, so only the last caller's value (the front camera's) survives. Normal use is unaffected, though, because slave->subdev_name itself is never modified.

HAL -> initDefaultParameters ->

Back 5M YUV: parsed from FOCUS_MODES_MAP_YUV [AUTO, MACRO, FIXED, INFINITY]; the first array element (AUTO) is taken as the default AF mode and applied via setFocusMode().

Front 2M YUV: parsed from FOCUS_MODES_MAP_FRONT [FIXED, INFINITY]; the first array element (FIXED) is taken as the default AF mode.

If neither camera yields a focusMode, both fall back to FOCUS_MODE_FIXED as the default AF mode.
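The default-mode selection above amounts to "first supported entry, else FIXED". A hypothetical sketch (the method name and the plain string values stand in for the HAL's tables):

```java
// Pick the default AF mode: the first entry of the supported-modes table,
// falling back to FIXED when the table is missing or empty.
class AfDefaults {
    static String defaultFocusMode(String[] supportedModes) {
        if (supportedModes != null && supportedModes.length > 0) {
            return supportedModes[0]; // e.g. AUTO for the back YUV table
        }
        return "fixed"; // stands in for FOCUS_MODE_FIXED
    }
}
```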

In setFocusMode, if m_pCapability->cam.sensor_format == CAM_FORMAT_YCBCR -> EXT_CAM_FOCUS is invoked, embedding one of the commands below:

(1) CAM_INTF_NATIVE_CMD_FOCUS  --- YUV
(2) CAM_INTF_PARM_FOCUS_MODE   --- Bayer

Via AddSetParmEntryToBatch(m_pParamBuf, cmd, sizeof(cmd), &info) -> memcpy(POINTER_OF(paramType, p_table), paramValue, paramLength) -> the command is copied into the corresponding memory region --->
In QCameraStateMachine.cpp -> updateParameters(), once all parameters are set, m_bNeedRestart = true; QCameraStateMachine.cpp then dispatches on QCAMERA_SM_EVT_SET_PARAMS -->
QCamera2HWI: m_parent->commitParametersChanges() -> { mParameters.commitParameters(), mParameters.setNumOfSnapshot() (sets the number of capture frames) } -> commitSetBatch() -> m_pCamOpsTbl->ops->set_parms(camera_handle, m_ParamBuf), commitParamChanges() (commit changes from tmp storage into the param map)
--> mm_camera.c -> mm_camera_set_parms() -> mm_camera_util_s_ctrl(CAM_PRIV_PARM) -> ioctl(VIDIOC_S_CTRL) -> the v4l2 ioctl is handled in camera.c -> camera_v4l2_s_ctrl() -> packet(MSM_CAMERA_SET_PARM) -> posted to server.c -> mct_pipeline.c
-> case MCT_EVENT_CONTROL_SET_PARM: mct_pipeline_process_set(data, pipeline) --> switch on data->command -> case CAM_PRIV_PARM: (parses the evt_id passed down from the HAL, a command defined on top of V4L2_CID_PRIVATE_BASE 0x08000000)
(1) if the pipeline has submodules: info.stream_type = CAM_STREAM_TYPE_PREVIEW / CAM_STREAM_TYPE_RAW (RDI streaming) -> stream = mct_pipeline_get_stream(pipeline, &info)
(2) why only preview/raw streams are handled here, I don't know
mct_pipeline_send_ctrl_events(pipeline, stream->stream_id, MCT_EVENT_CONTROL_SET_PARM) -> pipeline->send_event() (the packed info carries MCT_EVENT_CONTROL_CMD / MCT_EVENT_DOWNSTREAM / identity (session_id | stream_id)) -> stream->send_event() -> find the MCT_MODULE_FLAG_SOURCE submodule -> src_module->process_event() -> module_sensor.c module_sensor_module_process_event() -> switch event_ctrl->type, case MCT_EVENT_CONTROL_SET_PARM (checks the info packed in mct_pipeline.c) -> module_sensor_event_control_set_parm() -> (for a YUV sensor, module_sensor.c passes EXT_CAM_SET_AF_STATUS / EXT_CAM_GET_AF_STATUS down to sensor.c, then waits here for the return value and posts MCT_BUS_MSG_Q3A_AF_STATUS to mct_bus) -> switch event_control->type, case CAM_INTF_NATIVE_CMD_FOCUS / SENSOR_SET_NATIVE_CMD -> func_tbl.process(SENSOR_SET_NATIVE_CMD) --> sensor.c
-> sensor.c dispatches SENSOR_SET_NATIVE_CMD to sensor_set_native_cmd() -> ioctl(s_data->fd, VIDIOC_MSM_SENSOR_NATIVE_CMD, &cam_info) (after the call, data can come back from the driver to user space) -> msm_sensor.c -> sensor driver -> xxx_set_af_mode --> the AF_OCR and AF_MACRO modes both write the xx_macro_af registers, while AF_AUTO writes the normal_af registers --> so the AF mode is set as soon as the camera is opened.

--------------------------------------(how do we know which driver file gets called? via the v4l-subdev child devices)-------------------------------

At boot, server_process.c iterates over all modules and calls each one's init function. For the "sensor" module, module_sensor_init() --> s_module = mct_module_create("sensor") creates the module and sets its session ops; s_module->module_private = module_ctrl. sensor_init_probe(module_ctrl) --> finds the driver whose group_id == MSM_CAMERA_SUBDEV_SENSOR_INIT (msm_sensor_init.c), opens it, and returns the file handle sd_fd -> sensor_probe(sd_fd, sensor_libs[i], &sensor_probe_parms) (iterating over the front and back sensors) calls into the kernel -> cfgtype = CFG_SINIT_PROBE -> msm_sensor_init.c -> msm_sensor_driver_cmd()
-> msm_sensor_driver_probe(cfg->cfg.setting) handles the slave_info passed down from vendor; that variable comes from the sensor_slave_info member of the sensor_lib_ptr structure in the vendor driver's xxx_lib.c.
Its contents include {camera_id / slave_addr / addr_type / data_type / i2c_freq_mode / sensor_id_info} (the last holding the ID register address and the sensor_ID value) plus the power_up/down_setting_array power sequences. -> msm_sensor_driver.c parses camera_id -> camera_id indexes the static array g_sctrl[].
When is g_sctrl[] filled?
-> module_init(msm_sensor_driver_init) -> msm_sensor_platform_probe (I forget whether matching is by name or by compatible) --> platform_set_drvdata(pdev, s_ctrl)
saves the driver object s_ctrl into the pdev; s_ctrl->sensor_device_type = MSM_CAMERA_PLATFORM_DEVICE marks the device type as platform (an I2C device gets MSM_CAMERA_I2C_DEVICE)
--> msm_sensor_driver_parse() -> msm_sensor_driver_get_dt_data(s_ctrl, parsing the dtsi), msm_sensor_init_default_params(s_ctrl) ->
here the communication path is chosen by device type:
           platform -- msm_sensor_cci_func_tbl,  I2C -- msm_sensor_qup_func_tbl
The v4l2 subdev ops are set: s_ctrl->sensor_v4l2_subdev_ops = &msm_sensor_subdev_ops, which mainly fills the ioctl and s_power members, so this driver now has working ioctl and s_power handlers.
The sensor can therefore reach the msm_sensor.c driver, and at the end of msm_sensor_driver_parse the static array is filled: g_sctrl[s_ctrl->id] = s_ctrl;
msm_sensor_driver_probe can then look up the g_sctrl[] array to get the camera's driver object s_ctrl. The probe function first fills s_ctrl->func_tbl; a YUV sensor driver's ops are:
/config/power/match_id/control/
{
power_info = &s_ctrl->sensordata->power_info;
First fetch the power-up info parsed from the dtsi at boot probe, then order the vendor-supplied power_setting entries of SENSOR_VREG type against the LDOs the dtsi declares. For example, for
power_setting[i].seq_type == SENSOR_VREG && power_setting[i].seq_val == CAM_VDIG -> search the cam_vreg[num_vreg] array saved from the dtsi for the entry matching "cam_vdig", say cam_vreg[j],
and update power_setting[i].seq_val = j; then, to drive CAM_VDIG, ctrl->cam_vreg[power_setting->seq_val] yields the matching LDO directly.
vreg and gpio are different types, so they live in separate arrays, which makes lookups convenient and fast.
}
Back in msm_sensor_driver_probe, if the device is a platform device, it calls msm_sensor_driver_create_v4l_subdev(s_ctrl) -> camera_init_v4l2(&s_ctrl->pdev->dev --> dev, &session_id):
v4l2_subdev_init() ---> v4l2_dev->mdev->dev = dev, pvdev->vdev->v4l2_dev = v4l2_dev;
v4l2_device_register(dev, pvdev->vdev->v4l2_dev); --> bus.c driver->driver.name = driver->name; --> v4l2_dev->dev = dev, and the driver name is stored in v4l2_dev->name[0]
video_register_device(pvdev->vdev, VFL_TYPE_GRABBER, -1) --> registers the /dev/videoX device
Platform probe flow:
msm_sensor_driver_platform_probe(struct platform_device *pdev) ->
(1) allocate s_ctrl with kzalloc; platform_set_drvdata(pdev, s_ctrl) -> dev_set_drvdata(pdev->dev, s_ctrl) ---> pdev->dev->p->driver_data = s_ctrl; the driver object is saved for easy later reference
(2) s_ctrl->sensor_device_type = MSM_CAMERA_PLATFORM_DEVICE; s_ctrl->of_node = pdev->dev.of_node;
(3) msm_sensor_driver_parse(s_ctrl) --> parses the dtsi and fills s_ctrl
(4) pdev->id = s_ctrl->id ---- cell_id; s_ctrl->pdev = pdev; s_ctrl->sensordata->power_info.dev = &pdev->dev; ---- purpose unclear
A YUV sensor then runs its power-up, returns to sensor.c, and sends probe_done to the kernel to tell the driver the wait is over. Does a Bayer sensor really not power up at boot? I don't get it; so far I have only found power-up at camera open.
(For an I2C device, msm_sensor_driver_create_i2c_v4l_subdev is called instead -> struct i2c_client *client = s_ctrl->sensor_i2c_client->client; camera_init_v4l2(&client->dev, id);
Where is client set? ---- the I2C probe flow:
msm_sensor_driver_i2c_probe(struct i2c_client *client, const struct i2c_device_id *id) -> i2c.h i2c_set_clientdata(struct i2c_client *dev, (void *)s_ctrl, freshly kzalloc'ed) -->
(1) dd.c dev_set_drvdata(struct device * --- &client->dev, s_ctrl) ---> dev->p->driver_data = s_ctrl; -----> (struct i2c_client *) client->dev->p->driver_data = s_ctrl;
(2) s_ctrl->sensor_device_type = MSM_CAMERA_I2C_DEVICE; s_ctrl->of_node = client->dev.of_node; (mark the device as I2C and hand the of_node parsed from the dtsi to the driver object)
(3) msm_sensor_driver_parse(s_ctrl); -> {
  kzalloc s_ctrl->sensor_i2c_client and s_ctrl->msm_sensor_mutex
  msm_sensor_driver_get_dt_data(s_ctrl) reads cell_id from of_node (which camera this is) --> msm_sensor_get_sub_module_index() fetches the sub-device info ->
    parses the vreg and gpio info --> msm_camera_get_dt_camera_info() fetches the ISP type (INT/EXT/SOC/NULL), whether there is CAL MEM (FROM/EEPROM/OTP), the version, companion, ois
  mutex_init(s_ctrl->msm_sensor_mutex)
  msm_sensor_init_default_params(s_ctrl) -> kzalloc s_ctrl->sensor_i2c_client->cci_client, and pick the ops per platform:
    platform: cci_client->cci_subdev = msm_cci_get_subdev(), s_ctrl->sensor_i2c_client->i2c_func_tbl = msm_sensor_cci_func_tbl; I2C: msm_sensor_qup_func_tbl
  g_sctrl[s_ctrl->id ---- cell_idx] = s_ctrl;
  back in msm_sensor_driver_i2c_probe(*client, *id): being an I2C device, s_ctrl->sensor_i2c_client->client = client; s_ctrl->sensordata->power_info.dev = &client->dev ---- purpose unclear
})
Back to the ioctl:
s_ctrl->sensor_v4l2_subdev_ops = &msm_sensor_subdev_ops -> msm_sensor_subdev_core_ops -> { .ioctl = msm_sensor_subdev_ioctl, .s_power = msm_sensor_power }
So every YUV sensor can now reach msm_sensor.c.
And v4l2_subdev_init(s_ctrl->msm_sd.sd, s_ctrl->sensor_v4l2_subdev_ops) -> sd->ops = ops installs these ops into the /dev/v4l-subdevX device, so opening /dev/v4l-subdevX from above calls into the driver msm_sensor.c. (camera_init_v4l2, by contrast, creates the /dev/videoX device and fills camera_v4l2_ops / camera_v4l2_ioctl_ops, the v4l2 ops used by the HAL layer, while this path serves the vendor layer.)
s_ctrl->sensordata->sensor_name = slave_info->sensor_name;
snprintf(s_ctrl->msm_sd.sd.name, s_ctrl->sensordata->sensor_name)
s_ctrl->msm_sd.sd.entity.name = s_ctrl->msm_sd.sd.name
memcpy(slave_info->subdev_name, s_ctrl->msm_sd.sd.entity.name) --- watch out here:
in msm.c, msm_sd_register_subdev() sets sd->entity.name = video_device_node_name(vdev) after v4l2_device_register_subdev / __video_register_device() have been called.
This changes sd.entity.name, so the final subdev_name should correspond to /dev/v4l-subdevX ----- for the vendor layer to use.

Also in msm_sensor_init_default_params, s_ctrl->func_tbl = &msm_sensor_func_tbl is set --- sensor.c can use this to power up a YUV sensor.

-------------------------------------------------------------------------------------------------------------------------------

NV12 and NV21 belong to the YUV420 family and use a two-plane layout: Y gets one plane and UV (CbCr) a second, interleaved plane, rather than three separate planes.

Sampling works like the previous format: Y'00, Y'01, Y'10 and Y'11 all share Cr00 and Cb00.
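The sharing described above can be made concrete with index arithmetic. A hypothetical helper (not an Android API) for NV21, where the Y plane occupies width*height bytes and is followed by the interleaved V/U plane:

```java
// NV21 addressing: every 2x2 block of luma pixels shares one V/U byte pair.
class Nv21Layout {
    // Offset of the Y sample for pixel (row, col).
    static int yIndex(int width, int row, int col) {
        return row * width + col;
    }
    // Offset of the V byte for pixel (row, col); the matching U byte is at +1.
    static int vIndex(int width, int height, int row, int col) {
        return width * height + (row / 2) * width + (col / 2) * 2;
    }
}
```

For a 4x4 frame, pixels (0,0), (0,1), (1,0) and (1,1) all resolve to the same chroma offset, which is exactly the Y'00/Y'01/Y'10/Y'11 sharing described above.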

The YUV AF flow:

1. The app calls auto focus, and the HAL runs the runAutoFocus thread (it only handles CAM_FOCUS_MODE_MACRO / CAM_FOCUS_MODE_AUTO; for CAM_FOCUS_MODE_INFINITY / FIXED / EDOF it returns fail to the app straight away).
2. Write the fast-AE registers xxx_fast_aec_on[].
3. Write and read the relevant registers to obtain the scene's light level; when cur_lux is below the threshold and flash mode is auto, or when flash mode is on, set flash_status to true.
4. If flash_status is true, write the two registers FAST_AE_ON && PRE_FLASH_ON, mark is_preflash = 1, turn the LED on in LED_LOW mode, and return SENSOR_AF_PRE_FLASH_ON to the HAL.
If flash_status is false, write the AE/AWB lock registers: (1) if AWB is AWB_AUTO/AWB_OFF, write both AE and AWB lock/unlock; (2) if AWB is already pinned to a specific mode, write only AE lock/unlock. Finally write the AF register (its mode was already set at camera open; macro/normal AF can also be chosen manually here) and return SENSOR_AF_START.
5. If the HAL finds the current AF state SENSOR_AF_PRE_FLASH_ON == 3 && hw->mAFCancel == false (i.e. focus not cancelled), it sets pre_flash = 1, sleeps 400 ms, and sends SENSOR_AF_PRE_FLASH_AE_STABLE to the driver. The driver writes 0x7000 to 0x002C and 0x2C74 to 0x002E, then reads from 0x0F12, exiting the loop on reading 0x01 and otherwise sleeping 30 ms, for at most 7 tries (30*7 = 210 ms); this tells whether AE is stable. It then writes the AEC/AWB lock registers (the helper checks first and writes only if not already written, to avoid repeats) and writes the AF register.
If the current AF state is SENSOR_AF_START (same return path as preflash), step 6 below runs directly:
6. Send EXT_CAM_GET_AF_STATUS to the driver, with hw->AF_Status = 1 set in the HAL.
First pass (passing 0): while (hw->AF_Status == 1 && timeout_cnt < 60 && hw->mAFCancel == 0) { lock; sensor_native_control_cam(); unlock; timeout_cnt++; usleep(1000*33); /* 33 ms per loop */ } until hw->AF_Status == 2 or timeout; even after leaving the loop, af_cancel can still be called.
Second pass (passing 1): if hw->AF_Status == 2, the first search succeeded; set hw->AF_Status = 256 and run while (hw->AF_Status == 256 && timeout_cnt < 100 && hw->mAFCancel == 0) { lock; sensor_native_control_cam(); unlock; timeout_cnt++; usleep(1000*35); }.
If hw->AF_Status == 0 (the driver's register write/read returned 0), the second search also succeeded and status is set to AUTO_FOCUS_SUCCESS; otherwise AUTO_FOCUS_FAIL (a failed first search also yields AUTO_FOCUS_FAIL).
Then hw->sendEvtNotify(CAMERA_MSG_FOCUS, status, 0) returns the result to the app. At this point the chain app -> auto_focus -> auto_focus_thread -> runAutoFocus is complete.
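The two polling loops in step 6 share one shape: poll a status value until it leaves the "searching" state or the retry budget runs out. A hedged sketch of that shape (readStatus stands in for the sensor_native_control_cam() round-trip; all names here are ours):

```java
import java.util.function.IntSupplier;

// Generic AF-style polling loop: returns true only if the status source
// leaves searchingValue and lands on doneValue within maxTries polls.
class AfPoller {
    static boolean pollUntilDone(IntSupplier readStatus, int searchingValue,
                                 int doneValue, int maxTries) {
        for (int tries = 0; tries < maxTries; tries++) {
            int status = readStatus.getAsInt();
            if (status != searchingValue) {
                return status == doneValue;
            }
            // the real HAL sleeps 33-35 ms here between register polls
        }
        return false; // timed out while still searching
    }
}
```

The first pass would be pollUntilDone(read, 1, 2, 60) and the second pollUntilDone(read, 256, 0, 100), mirroring the loop limits in the notes.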

How the HAL learns that the AF result is back:

In module_sensor.c, after the ioctl into the kernel returns, if the mode passed into (or returned by) the ioctl is EXT_CAM_SET_AF_STATUS / EXT_CAM_GET_AF_STATUS, an MCT_BUS_MSG_Q3A_AF_STATUS message is packed and mct_module.c mct_module_post_bus_msg(module, &bus_msg) is called: it first finds the stream via mct_pipeline_find_stream, casts it to an mct_pipeline_t * object, and calls pipeline->bus->post_msg_to_bus. When the HAL parses the metadata it can then call processAutoFocusEvent() to handle it; for the YUV format this only updates the value of AF_Status.

The auto-flash capture flow:

(1) Run the AF thread and obtain the focus-success signal.

05-27 15:44:55.043  5365  5365 D Camera4 : onKeyUp()
05-27 15:44:55.043  5365  5365 V Camera4 : handling onKeyUp event from Activity class
05-27 15:44:55.043  5365  5365 V Camera4 : isShowSwitchCameraAnimation : started - false, finished - true
05-27 15:44:55.053  5365  5365 V Camera4 : handleShutterKeyReleased
05-27 15:44:55.053  5365  5365 V CommonEngine: handleShutterReleaseEvent
05-27 15:44:55.053  5365  5365 I CommonEngine: handleShutterReleaseEvent - mFocusState: 0
05-27 15:44:55.053  5365  5365 V CommonEngine: scheduleTakePicture
05-27 15:44:55.053  5365  5365 V CommonEngine: scheduleStartPreview
05-27 15:44:55.053  5365  5365 V CameraSettings: getForcedShutterSound: 0
05-27 15:44:55.053  5365  5472 V CommonEngine: doAutoFocusAsync  ----- the app performs the focus operation
05-27 15:44:55.053  5365  5472 E AXLOG   : Shot2Shot-Autofocus**StartU[1464331495064]**
05-27 15:44:55.053  5365  5472 V CommonEngine: stopResetTouchFocusTimer
05-27 15:44:55.083   418  1867 D CameraClient: autoFocus (pid 5365)
05-27 15:44:55.083   418  1867 D SecCameraCoreManager: autoFocus  ----------- handling the finger press: before capture is invoked, autofocus runs first
05-27 15:44:55.083   418  1867 E QCamera2HWI: auto_focus
05-27 15:44:55.093   418  5596 E QCamera2HWI: [KPI Perf] preview_stream_cb_routine : E
05-27 15:44:55.093   418  5934 E QCamera2HWI: [KPI Perf] auto_focus_thread: E   --- thread starts
05-27 15:44:55.093   418  5934 E QCamera2HWI: [KPI Perf] runAutoFocus: E
05-27 15:44:55.093   418  5934 E QCamera2HWI: AF runAutoFocus - getFocusMode =0 --- the thread first fetches the current mode; 0 = CAM_FOCUS_MODE_AUTO
05-27 15:44:55.093   418  5934 E QCamera2HWI: static int qcamera::QCamera2HardwareInterface::runAutoFocus(camera_device*): Start AF    --- tells the driver AF can begin
05-27 15:44:55.403   418  5934 E QCamera2HWI: static int qcamera::QCamera2HardwareInterface::runAutoFocus(camera_device*): check AF status --- AE/AWB locked and AF register written; took 310 ms
05-27 15:44:55.413   418  5934 E QCamera2HWI: static int qcamera::QCamera2HardwareInterface::runAutoFocus(camera_device*): AF_Status = 1 --- initialized to 1 in the HAL; wait for it to become non-1
...
05-27 15:44:55.703   418  5934 E QCamera2HWI: static int qcamera::QCamera2HardwareInterface::runAutoFocus(camera_device*): AF_status = 256 --- set to 256 in the HAL; wait for it to become 0 ---> AF valid
... from writing the AF register to AF returning valid takes 590 ms; from AF start to a valid result, 930 ms in total (this is the preflash case; without preflash it takes much less)
05-27 15:44:56.023   418  5934 E QCamera2HWI: static int qcamera::QCamera2HardwareInterface::runAutoFocus(camera_device*): AF success  ------- AF succeeded, can call back to the app; is_focus_valid == 1
05-27 15:44:56.033   418  5934 E QCamera2HWI: [KPI Perf] runAutoFocus: X, ret 0
05-27 15:44:56.033   418  5934 E QCamera2HWI: [KPI Perf] auto_focus_thread: X  --- thread ends, AF status returned
05-27 15:44:56.043  5365  5365 V CommonEngine: AutoFocusCallback.onAutoFocus : msg[1] focusState[1]
05-27 15:44:56.043  5365  5365 E AXLOG   : Shot2Shot-Autofocus**EndU[1464331496048]**       // 56.043 - 55.053 = 990 ms, quite a while
05-27 15:44:56.033   418  5595 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xacf7b000, fd = 87, handle = 0x1 length = 450560, ION Fd = 86  --- presumably preview data
05-27 15:44:56.033   418  5621 E QCameraStream: dataNotifyCB:
05-27 15:44:56.033   418  5621 E QCameraStream: processDataNotify:
05-27 15:44:56.033   418  5596 E QCameraStream: dataProcRoutine: Do next job
05-27 15:44:56.033   418  5596 E QCamera2HWI: [KPI Perf] preview_stream_cb_routine : E
05-27 15:44:56.043   418  5596 E QCameraHWI_Mem: displayBufferForSamsungApp: dequed buf hdl =0xb894ba30
05-27 15:44:56.043   418  5596 E QCameraHWI_Mem: displayBufferForSamsungApp: Found buffer in idx:0  --- this function is used for display

(2) After focus succeeds, the app triggers the capture, first setting parameters and enabling some MSGs.

05-27 15:44:56.043  5365  5472 E AXLOG   : Shot2Shot-TakePicture**StartU[1464331496051]**
05-27 15:44:56.053   418  2206 D CameraClient: sendCommand (pid 5365)
05-27 15:44:56.053   418  2206 D CameraClient: ENABLE_SHUTTER_SOUND (2, 1)
05-27 15:44:56.063  5365  5472 D Camera4 : startBlinkBlackScreenShutterAnimation
05-27 15:44:56.113  5365  5472 I CommonEngine: doTakePictureAsync - rotation : 90  ----- rotation is 90°, meaning the back camera
05-27 15:44:56.133   479  5624 E mm-camera: returning from demux as this is YUV sensor   ----- from demux40.c: it returns as soon as chromatix_ptr is found to be NULL
isp_pipeline.c and isp_pipeline_util.c pass the ISP_HW_MOD_SET_TRIGGER_UPDATE command to the individual ISP submodules; isp_pipeline_set_effect loops over all ISP submodules.
isp_pipeline_util_trigger_update calls every submodule except ISP_MOD_STATS (isp_hw_module_ops.h defines linearization/rolloff/demux/bpc/abf/asf/color/chroma/mce/sce/clf/wb/gamma/fov/scaler/bcc/stats/frame_skip)
isp_pipeline_util_update_af calls only ISP_MOD_STATS ---- so a Bayer sensor probably continues here, while the call returns at this point because a YUV sensor does not use the Qualcomm ISP modules, stats included.
05-27 15:44:56.133   418  5509 E QCameraParameters: InCall is not set  { params.get("incall");
reads the string and, depending on whether it is 0/1, calls property_set("persist_camera.incall", "true"/"false"); actuator.c calls property_get("persist_camera.incall", value, "0"): if a call is in progress, SW landing is chosen to avoid an audible impact }
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] Requested preview size 800 x 480
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] Requested video size 1280 x 720
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] Requested picture size 2560 x 1536
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] requested jpeg thumbnail size 480 x 288
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] rotation val = 90
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] m_bNoDisplayMode = 0  --- display or not
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] ZSL = OFF
05-27 15:44:56.153   418  5509 E QCameraParameters: [PARM_DBG] Requested FpsRange Values:(15000, 30000)
05-27 15:44:56.153   418  5509 E QCameraParameters: setPreviewFrameRate: requested preview frame rate is 30
05-27 15:44:56.153   418  5509 E QCameraParameters: setShotMode: prev shotmode=1  --- AUTO
05-27 15:44:56.153   418  5509 E QCameraParameters: setLiveSnapshotSize For YCBCR sensor Live snap shot resolution size: width 2560 , height 1536   --- liveshot capture size; Bayer has its own liveshot table
We are now in the focused state; the capture that follows involves state-machine transitions:
QCAMERA_SM_STATE_STOPPED is set in the constructor, but where is that constructor invoked? QCamera2HardwareInterface is a friend class of QCamera2HWI; initializing the m_stateMachine object invokes the constructor: cameraDeviceOpen(camera_id, struct hw_device_t **hw_device) -> new QCamera2HardwareInterface(camera_id) --API1_0--> QCameraStateMachine; hw->openCamera(hw_device);
(Only after the AF register is written, the AF state turns valid, and success is returned does the app start the capture. You can clearly see that from focus-done, through capture, to the return to preview, the picture stays frozen; there is no ZSL.)
05-27 15:44:56.163   418  1867 D CameraClient: takePicture (pid 5365): 0x102
05-27 15:44:56.163   418  1867 D CameraClient: enableMsgType : msg(0x102, 0xd0f)
05-27 15:44:56.163   418  1867 D SecCameraCoreManager: enableMsgType : msg(In:0x102, Out:0xd0f)
05-27 15:44:56.163   418  1867 E QCamera2HWI: enable_msg_type : E, msg type 258
05-27 15:44:56.163   418  1867 E QCamera2HWI: waitAPIResult: wait for API result of evt (3) --- QCAMERA_SM_EVT_ENABLE_MSG_TYPE as defined in QCameraStateMachine.cpp
05-27 15:44:56.163   418  5509 E QCamera2HWI: signalAPIResult: result->request_api=3 result->result_type=0 E
05-27 15:44:56.163   418  1867 E QCamera2HWI: waitAPIResult: return (0) from API result wait for evt (3) --- the 0 should likewise be defined in QCameraStateMachine.h
05-27 15:44:56.163   418  1867 E QCamera2HWI: enable_msg_type : X
05-27 15:44:56.163   418  1867 D CameraClient: takePicture : playSound  --- plays the shutter sound
05-27 15:44:56.163   418  1867 D CameraService: playSound(0)
05-27 15:44:56.223   418  1867 V MediaPlayer: start
05-27 15:44:56.223   418  1867 V MediaPlayerService: [8] setLooping(0)
05-27 15:44:56.223   418  1867 V MediaPlayerService: [8] setVolume(0.170000, 0.170000)
05-27 15:44:56.223   418  1867 V AudioSink: setVolume(0.170000, 0.170000)
05-27 15:44:56.223   418  1867 V MediaPlayerService: [8] setAuxEffectSendLevel(0.000000)
05-27 15:44:56.223   418  1867 V AudioSink: setAuxEffectSendLevel(0.000000)

(3) The HAL handles the capture command

05-27 15:44:56.223   418  1867 I ShotSingle: ShotSingle::takePicture start
05-27 15:44:56.223   418  1867 V ShotSingle: takePicture(0)
05-27 15:44:56.223   418  1867 E QCamera2HWI: take_picture --- currently in the QCAMERA_SM_STATE_PREVIEWING state (set after m_parent->preparePreview() && m_parent->startPreview()); the capture thread is started here
05-27 15:44:56.223   418  1867 I ShotSingle: ShotSingle::takePicture end
05-27 15:44:56.233   418  5963 E QCamera2HWI: [KPI Perf] take_picture_thread: PROFILE_TAKE_PICTURE E ----- capture thread created
05-27 15:44:56.233   418  5963 E QCamera2HWI: [KPI Perf] take_picture_internal: E
05-27 15:44:56.233   418  5963 E QCamera2HWI: take_picture_internal Flash Mode=0, LLS mode=0, Auto LLS mode=0, NumOfSnaps=1  ---- flash off / a single shot
05-27 15:44:56.233   418  5963 E QCamera2HWI: take_picture_internal: start capture   ------ sends QCAMERA_SM_STATE_PIC_TAKING to QCameraSM and waits for the result
05-27 15:44:56.233   418  5963 E QCamera2HWI: waitAPIResult: wait for API result of evt (19)
05-27 15:44:56.233   418  5509 E QCameraStateMachine: procEvtPreviewingState:1167 change m_state to 4 ---- transitions from the previewing state to QCAMERA_SM_STATE_PIC_TAKING and calls m_parent->takePicture
05-27 15:44:56.233   418  5509 E QCamera2HWI: [TS_DBG] takePicture: E mCameraId=0  ---> reached from QCameraSM (YUV has no ZSL mode, so preview must be stopped). stopChannel(QCAMERA_CH_TYPE_PREVIEW) has two steps: (1) first stop each HAL stream thread, (2) then stop the mm_camera_intf channel.
delChannel(QCAMERA_CH_TYPE_PREVIEW) -> m_channels[QCAMERA_CH_TYPE_PREVIEW] = NULL;
(1) -> mStreams[i]->stop() -> mProcTh.exit() --> pthread_join(cmd_pid, NULL); the stream thread exits (cmd_pid is a public member of QCameraCmdThread.cpp). Each stream's data-callback thread is ended first.
05-27 15:44:56.233   418  5509 E QCameraCmdThread: [DBG] exit: Before thread join  ---- where is the thread released? in the stop() above, while start() launches it
05-27 15:44:56.233   418  5595 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): Exit ---- preview
05-27 15:44:56.233   418  5595 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): X
05-27 15:44:56.243   418  5509 E QCameraCmdThread: [DBG] exit: After thread join
05-27 15:44:56.243   418  5509 E QCameraCmdThread: [DBG] exit: Before thread join
05-27 15:44:56.243   418  5596 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): Exit ---- metadata
05-27 15:44:56.243   418  5596 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): X
05-27 15:44:56.243   418  5509 E QCameraCmdThread: [DBG] exit: After thread join
(2) -> m_camOps->stop_channel() -> mm_channel_stop() releases the stream resources inside the channel [stops the streams, unregisters them from the kernel, frees the super_buf, and releases the stream's buffers via put_buf]
mm_camera_stream_off is called --------------- handling the preview data
05-27 15:44:56.243   418  5509 E mm-camera-intf: mm_stream_streamoff: E, my_handle = 0x801, fd = 67, state = 6 -> ioctl(VIDIOC_STREAMOFF) -> camera.c MSM_CAMERA_PRIV_STREAM_OFF->mct_pipeline.c
05-27 15:44:56.253   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:56.253   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:56.253   479  5550 E mm-camera: mct_pipeline_process_set:command=8000009 --  MSM_CAMERA_PRIV_STREAM_OFF = V4L2_CID_PRIVATE_BASE (0x8000000) + 9 -> the kernel maps the HAL's fd to a stream_id
05-27 15:44:56.253   479  5550 E mm-camera: mct_pipeline_process_set: stream_type = 1  ---------- the preview stream type. mct_pipeline.c (MCT_EVENT_CONTROL_STREAMOFF & MCT_EVENT_DOWNSTREAM & MCT_EVENT_CONTROL_CMD) -> pipeline->send_event() -> mct_stream_send_event() (in stream processing only the FLAG_SOURCE submodule is invoked) ->
src_module->process(src_module,event) -> mct_module.c (mct_module_send_event()  {case: MCT_EVENT_DOWN_STREAM , dest_port -> event_func(dest_port,event) )
-> cpp_port.c ---> -- MCT_EVENT_CONTROL_STREAMOFF -- cpp_module.c -- cpp_module_process_downstream_event() -> cpp_module_event.c -> cpp_hardware.c
Within the stream, cpp_module does not seem to be reached as FLAG_SOURCE; so where is that set?
server_process.c ---- {"pproc", pproc_module_init, pproc_module_deinit} ---> mct_module_create("pproc") initializes the submodules cpp/vpe/c2d/cac/wnr, saves them into mod_private, and assigns that to the pproc->priv variable
--> then pproc_module_create_default_ports(pproc) creates pproc's stream/capture sink ports and source ports --> pproc_port_init() creates the pproc module's sink/source ports
--> pproc_port_sink_check_caps_reserve -> pproc_port_create_stream_topology(): if the stream is CAM_STREAM_TYPE_PREVIEW/POSTVIEW it calls -> pproc_port_add_modules_to_stream() --->
which sets cac/wnr as MCT_MODULE_FLAG_SOURCE and cpp/vpe as MCT_MODULE_FLAG_SINK, adding them to the stream via mct_stream_link_modules; if only cpp/vpe are present, the stream handled is of type CAM_STREAM_TYPE_OFFLINE_PPROC

CPP / ISP stop the data streams

05-27 15:44:56.253   479  5550 E mm-camera: cpp_module_handle_streamoff_event:1887, info: doing stream-off for identity 0x20002  ----- cpp_module_event.c
05-27 15:44:56.263   479  5550 E mm-camera: cpp_module_handle_streamoff_event:1933] iden:0x20002, linked_params:0x0
05-27 15:44:56.263   479  5550 E mm-camera: cpp_hardware_process_streamoff:537] skip_iden:0x0, duplicate_stream_status:0x0  ---- cpp_hardware.c
05-27 15:44:56.263   479  5550 E mm-camera: cpp_module_handle_streamoff_event:1944, info: stream-off done for identity 0x20002
05-27 15:44:56.263   479  5550 E mm-camera: isp_streamoff: E, session_id = 2  ------> isp_port.c dispatches MCT_EVENT_CONTROL_STREAMOFF to isp_streamoff -> isp_proc_async_command
05-27 15:44:56.263   479  5522 E mm-camera: isp_proc_async_command: E ISP_ASYNC_COMMAND_STREAMOFF = 2
05-27 15:44:56.263   479  5522 E mm-camera: isp_proc_streamoff: E, session_id = 2, stream_id = 2, stream_type = 1
05-27 15:44:56.263   479  5623 E mm-camera: isp_axi_util_subscribe_v4l2_event: event_type = 0x8000100, is_subscribe = 0
05-27 15:44:56.293   479  5522 E mm-camera: isp_proc_async_command: X ISP_ASYNC_COMMAND_STREAMOFF = 2
05-27 15:44:56.293   479  5550 E mm-camera: isp_streamoff: X, session_id = 2
05-27 15:44:56.303   479  5550 E mm-camera: ispif_proc_streamoff: Enter       -----> port_ispif.c dispatches MCT_EVENT_CONTROL_STREAMOFF to ispif_streamoff() -> ispif_proc_streamoff()
05-27 15:44:56.303   479  5550 E mm-camera: ispif_proc_streamoff: Make ISPIF_CFG IOCTL!
05-27 15:44:56.303   479  5550 E mm-camera: ispif_proc_streamoff: ISPIF_CFG IOCTL returns!
05-27 15:44:56.303   479  5550 E mm-camera: ispif_proc_streamoff: X, rc = 0
05-27 15:44:56.303   479  5550 E mm-camera: release_isp_resource <------ ispif.c calls ispif_util.c ispif_util_release_isp_resource for every stream -> isp_resource_mgr.c
05-27 15:44:56.303   479  5550 E mm-camera: release isp0 rdi
05-27 15:44:56.303   479  5550 E mm-camera: release_isp_resource: isp_id = 0, camif_session_bit = 0x0
05-27 15:44:56.303   479  5550 E mm-camera-sensor: led_flash_process:129 CAM Flash Off  ----->  called from module_sensor_stream_off; this is meant for Bayer sensors, so hitting it here is probably a missing macro guard
05-27 15:44:56.313   479  5550 E mm-camera: stop_sof_check_thread: Stopping/Joining SOF timeout thread  ---- stops the thread; mct_bus.c receives the MCT_BUS_MSG_SENSOR_STOPPING sent by module_sensor.c
05-27 15:44:56.313   418  5509 E mm-camera-intf: mm_camera_cmd_thread_stop: before join 0xac248930, qsize = 1  ---- called from mm_stream_fsm_active() on MM_STREAM_EVT_STOP, after streamoff has completed
05-27 15:44:56.313   418  5621 E mm-camera-intf: mm_camera_cmd_thread: MM_CAMERA_CMD_TYPE_EXIT - cmd_pid = 0xac248930
05-27 15:44:56.313   418  5621 E mm-camera-intf: mm_camera_cmd_thread: X - cmd_pid = 0xac248930
Call mm_camera_stream_off --- handles the metadata stream
05-27 15:44:56.323   418  5509 E mm-camera-intf: mm_stream_streamoff: E, my_handle = 0x700, fd = 64, state = 6
05-27 15:44:56.323   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:56.323   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:56.323   479  5550 E mm-camera: mct_pipeline_process_set:command=8000009
05-27 15:44:56.323   479  5550 E mm-camera: mct_pipeline_process_set: stream_type = 7  --------- should be the metadata stream type; this type is always the first to start and the last to stop
05-27 15:44:56.323   418  5509 E mm-camera-intf: mm_camera_cmd_thread_stop: before join 0xacb2e930, qsize = 1 -- after the mm_camera_stream object is stopped, mm_camera_cmd_thread_release is called to release it
05-27 15:44:56.323   418  5612 E mm-camera-intf: mm_camera_cmd_thread: MM_CAMERA_CMD_TYPE_EXIT - cmd_pid = 0xacb2e930
05-27 15:44:56.323   418  5612 E mm-camera-intf: mm_camera_cmd_thread: X - cmd_pid = 0xacb2e930
Then MM_STREAM_EVT_STOP / MM_STREAM_EVT_UNREG_BUF / MM_STREAM_EVT_PUT_BUF are issued (ioctl(VIDIOC_REQBUFS) initializes a stream; with V4L2_MEMORY_MMAP it also allocates the actual memory). State changes: active -(ioctl stream_off)-> reg -(reqbufs)-> buffered -(PUT_BUFS / deinit_bufs)-> cfg
(mm_stream_request_buf and mm_stream_unreg_buf both call ioctl(VIDIOC_REQBUFS))
Freeing memory: mm_channel_stop() -> mm_stream_deinit_bufs() calls mem_vtbl.put_bufs() -> QCameraStream::put_bufs() -> QCameraStream::putBufs() ---> ops_tbl->unmap_ops(), mStreamBufs->deallocate()
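The stream teardown state changes above can be sketched as a tiny state machine. State and event names here are simplified stand-ins, not the exact enums from mm_camera_stream.c:

```c
/* Simplified sketch of the mm_stream teardown transitions described above.
 * Names are illustrative; the real enums live in the mm-camera-interface code. */
typedef enum { STATE_ACTIVE, STATE_REG, STATE_BUFFED, STATE_CFG } stream_state_t;
typedef enum { EVT_STOP, EVT_UNREG_BUF, EVT_PUT_BUF } stream_evt_t;

static stream_state_t stream_fsm(stream_state_t s, stream_evt_t e) {
    switch (s) {
    case STATE_ACTIVE: if (e == EVT_STOP)      return STATE_REG;    break; /* ioctl stream_off */
    case STATE_REG:    if (e == EVT_UNREG_BUF) return STATE_BUFFED; break; /* ioctl(VIDIOC_REQBUFS) */
    case STATE_BUFFED: if (e == EVT_PUT_BUF)   return STATE_CFG;    break; /* deinit_bufs / put_bufs */
    default: break;
    }
    return s; /* unhandled event: stay in the same state */
}
```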

Freeing the buffers:

05-27 15:44:56.343   418  5509 E QCameraHWI_Mem: deallocate: E
05-27 15:44:56.343   418  5509 E QCameraHWI_Mem: put buffer 0 successfully
05-27 15:44:56.343   418  5509 E QCameraHWI_Mem: cancel_buffer: hdl =0xb898e560
...
05-27 15:44:56.363   418  5509 E QCameraHWI_Mem: put buffer 6 successfully     ----- 7 preview bufs freed in total
05-27 15:44:56.363   418  5509 E QCameraHWI_Mem:  deallocate : X 

Adding the capture channel:

addCaptureChannel() -> pChannel->init() , addStreamToChannel(CAM_STREAM_TYPE_SNAPSHOT)
(1) addStreamToChannel -> allocateStreamInfoBuf(streamType)
(now compute how many buffers to allocate for each format):
05-27 15:44:56.383   418  5509 E QCameraHWI_Mem: alloc: use ION_SYSTEM_HEAP_ID memory: (33554432)
05-27 15:44:56.393   418  5509 E QCamera2HWI: getBufNumRequired : minCaptureBuffers=1, zslQBuffers=2, numStreamBuffer=8 numReprocBuffer = 7, minUndequeCount = 1
05-27 15:44:56.393   418  5509 E QCamera2HWI: getBufNumRequired : stream_type=7, bufferCnt=20   ----- metadata
05-27 15:44:56.393   418  5509 E QCameraParameters: [PARM_DBG] getFlipMode : the filp mode of stream type 7 is 0 .
05-27 15:44:56.393   418  5509 E QCamera2HWI: getBufNumRequired : stream_type=3, bufferCnt=2    ----- snapshot allocates 2 bufs; why 2?
05-27 15:44:56.393   418  5509 E QCameraParameters: [PARM_DBG] getFlipMode : the filp mode of stream type 3 is 0 .
(2) pChannel->addStream(streamCB,userData) -> pStream = new QCameraStream(), pStream->init() -> mCamOps->add_stream(), mCamOps->config_stream(); after the stream is added to the channel, its stream info is synced to the server: mm_channel_config_stream() -> mm_stream_cfg() -> mm_stream_sync_info() -> mm_camera_util_s_ctrl(CAM_PRIV_STREAM_INFO_SYNC) -> ioctl(fd,VIDIOC_S_CTRL):
05-27 15:44:56.403   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:56.403   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:56.403   479  5550 E mm-camera: mct_pipeline_process_set:command=8000012  --------> 0x8000000 + 18 (0x12) -> CAM_PRIV_STREAM_INFO_SYNC
Adding the various modules to the stream object:
(session_id and stream_info (mode / num_of_burst / buf_planes / chromatix / num_bufs etc.) are parsed out of the stream)
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_start_link: Regular snapshot -- i.e. not CAM_STREAMING_MODE_CONTINUOUS (continuous would mean a ZSL stream, which does not need to be linked to the pproc module)
The mods are just a list in server_process.c: modules_list[] = {{"sensor",module_sensor_init,module_sensor_deinit}, ..., {"faceproc",module_faceproc_init,module_faceproc_deinit}}. module_sensor_init("sensor") calls mct_object_set_name(name) to store the name in object->name. server_process.c calls modules_list[i].init_mod(modules_list[i].name) for each module, e.g. module_sensor_init(), and appends the returned mct_module_t * to the modules list. Why the surrounding while loop? If init returns NULL it retries a few times, which can make the camera fail to open.
(some functions are initialized: set_mod / query_mod / start_session / stop_session / get_session_data; note s_module->module_private = module_ctrl, and sensor_init_probe(module_ctrl) is where module_private comes from)

The mods are taken from the list one by one and added to the stream; depending on stream and sensor type, different Qualcomm sub-modules are added:

mct_stream_start_link()->
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=sensor, name=sensor  --- sensor: walk the mods list looking for the mod whose name matches (obj->name == name ?)
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_get_module: module: 0xb7335f00
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=sensor, name=iface
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=iface, name=iface  --- iface: iface->set_mod(FLAG_SOURCE / FLAG_INDEXABLE / FLAG_SINK) sets the module's role in the stream
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_get_module: module: 0xb733ff70  --- isp
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=sensor, name=isp
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=iface, name=isp
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=isp, name=isp
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_get_module: module: 0xb7342648
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=sensor, name=stats  -- stats
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=iface, name=stats
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=isp, name=stats
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=stats, name=stats
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_get_module: module: 0xb7342970
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=sensor, name=pproc  ---pproc
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=iface, name=pproc
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=isp, name=pproc
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=stats, name=pproc
05-27 15:44:56.403   479  5550 E mm-camera: mct_stream_check_name:mod=pproc, name=pproc ---- looking for the matching mod (finding the sub-modules); once all are found, mct_stream_link_modules() links them into the stream:
if (pproc) {
  mct_stream_link_modules(stream,sensor,isp,pproc)  --- both Bayer and YUV sensors add this: everything goes through pproc
  mct_stream_link_modules(stream,isp,stats)         --- only Bayer sensors add this: only Bayer goes through the stats module
} else {
  mct_stream_link_modules(stream,sensor,iface,isp,stats)  --- Bayer: needs the stats module
  mct_stream_link_modules(stream,sensor,iface,isp)        --- YUV: YUV sensors skip stats
}
-> mct_module_link(stream, mod1 -> src, mod2 -> dest): in order, one module is set as the stream's src and the next as its dest.
Then mct_stream_add_modules(stream, mod1/mod2) sets the stream as the parent of mod1/mod2 and mod1/mod2 as children of the stream.
05-27 15:44:56.413   479  5550 E mm-camera: cpp_module_notify_add_stream:1295, info: success, identity=0x20002 (session_id == 2, back camera) cpp_port.c -> CPP_PORT_TYPE_BURST/STREAMING
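The repeated mct_stream_check_name lines above are a linear scan of the modules list, comparing each module's object name against the wanted name. A sketch of that lookup, with illustrative stand-in types for mct_module_t / mct_list_t:

```c
#include <string.h>
#include <stddef.h>

/* Sketch of the lookup the mct_stream_check_name log lines show: walk the
 * modules list and return the first module whose object name matches.
 * struct module here is an illustrative stand-in, not the real mct types. */
struct module { const char *name; struct module *next; };

static struct module *stream_get_module(struct module *mods, const char *name) {
    for (struct module *m = mods; m != NULL; m = m->next)
        if (strcmp(m->name, name) == 0)   /* obj->name == name ? */
            return m;                     /* -> mct_stream_get_module prints its address */
    return NULL;
}
```

This matches the log pattern: for name=isp the scan prints sensor, iface, then isp before mct_stream_get_module reports the hit.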
m_postprocessor.start(m_channels[QCAMERA_CH_TYPE_CAPTURE]);
05-27 15:44:56.413   418  5509 E QCamera2HWI: needReprocess: YUV Sensor: Reprocessing disabled   --- YUV sensors do not need reprocess
05-27 15:44:56.413   418  5509 E QCamera2HWI: startSnapshots --- QCameraPostProc -> m_cbNotifier.startSnapshots() -> sends the CAMERA_CMD_TYPE_START_DATA_PROC command to QCameraCB and QCameraPostProc
05-27 15:44:56.413   418  5509 E QCameraChannel: QCameraChannel::start(): bundleInfo.num_of_streams=1  --- only one picture needs to come back
05-27 15:44:56.413   418  5561 E QCamera2HWI: cbNotifyRoutine: get cmd 1  == CAMERA_CMD_TYPE_START_DATA_PROC (its value is 1); only computes numOfSnapshots / longshotEnabled? / timershotEnabled?
05-27 15:44:56.413   418  5557 E QCameraPostProc: dataProcessRoutine: start data proc  ---> why is this hit as well? Both threads wait on cmdThread->cmd_sem and both receive CAMERA_CMD_TYPE_START_DATA_PROC. It mainly sets is_active = TRUE and needNewSess = true, the precondition for CAMERA_CMD_DO_NEXT_JOB, pme->m_ongoingJpegQ and pme->encodeData().
startChannel() -> start_channel(QCAMERA_CH_TYPE_CAPTURE) -> m_channels[ch_type]->start(); called right after addCaptureChannel():
(1) mStreams[i]->start() starts the HAL stream data thread
(2) m_camOps->start_channel() - important: initializes super_buf_queue, allocates memory, registers bufs with the kernel (i.e. qbuf), runs the dispatch thread and turns on every stream via ioctl(VIDIOC_STREAMON) -> mct_pipeline.c, see below
05-27 15:44:56.423   418  5509 E QCameraStream: getBufs: [MEM_DBG] Enter: mDynBufAlloc: 0
05-27 15:44:56.423   418  5970 E QCameraStream: dataProcRoutine: E
05-27 15:44:56.483   418  5509 E QCameraHWI_Mem: size before align: 449796
05-27 15:44:56.483   418  5509 E QCameraHWI_Mem: size after align: 450560
05-27 15:44:56.493   418  5509 E QCameraStream: getBufs: [MEM_DBG] frame_len: 449796, numBufAlloc: 20
05-27 15:44:56.493   418  5509 E QCameraStream: getBufs: [MEM_DBG] All mNumBufs: 20
05-27 15:44:56.493   418  5509 E QCameraStream: getBufs: [MEM_DBG] MEMORY ALLOC TIME : 75ms
05-27 15:44:56.513   418  5509 E mm-camera-intf: mm_stream_request_buf: buf_num = 20, stream type = 7 -------- metadata; addStreamToChannel() -> allocateStreamInfoBuf(), getBufNumRequired()
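The "size before align: 449796 / size after align: 450560" pair above is the frame length rounded up to the next 4096-byte (page) boundary, which the ION allocation needs. A minimal sketch of that rounding:

```c
#include <stddef.h>

/* Round a buffer size up to the next multiple of align.
 * align must be a power of two (4096 = page size in these logs). */
static size_t align_up(size_t sz, size_t align) {
    return (sz + align - 1) & ~(align - 1);
}
```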
05-27 15:44:56.523   418  5509 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xacf51000, fd = 69, handle = 0x1 length = 450560, ION Fd = 68
... (the 20 metadata bufs above are enqueued to the kernel)
Turning on the metadata stream:
05-27 15:44:56.523   418  5509 E mm-camera-intf: mm_stream_streamon: E, my_handle = 0xa00, fd = 64, state = 6
05-27 15:44:56.523   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:56.523   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:56.523   479  5550 E mm-camera: mct_pipeline_process_set:command=8000008
05-27 15:44:56.523   479  5550 E mm-camera: mct_pipeline_process_set: stream_type = 7   ------  metadata
2 snapshot bufs:
05-27 15:44:56.523   418  5509 E QCameraStream: getBufs: [MEM_DBG] Enter: mDynBufAlloc: 0
05-27 15:44:56.523   418  5509 E QCameraHWI_Mem: alloc: use ION_SYSTEM_HEAP_ID memory: (33554432)
05-27 15:44:56.523   418  5509 E QCameraHWI_Mem: size before align: 5898240
05-27 15:44:56.523   418  5509 E QCameraHWI_Mem: size after align: 5898240
05-27 15:44:56.633   418  5509 E QCameraStream: getBufs: [MEM_DBG] frame_len: 5898240, numBufAlloc: 2
05-27 15:44:56.633   418  5509 E QCameraStream: getBufs: [MEM_DBG] All mNumBufs: 2
05-27 15:44:56.633   418  5509 E QCameraStream: getBufs: [MEM_DBG] MEMORY ALLOC TIME : 104 ms
05-27 15:44:56.633   479  5550 D mm-camera: mct_stream_create_buffers: plane idx = 0, offset 0, stride 2560, scanline = 1536
05-27 15:44:56.633   479  5550 D mm-camera: mct_stream_create_buffers: plane idx = 1, offset 0, stride 2560, scanline = 768
Enqueue the bufs to kernel space:
05-27 15:44:56.633   418  5509 E mm-camera-intf: mm_stream_request_buf: buf_num = 2, stream type = 3 ----------- snapshot; this is what getBufNumRequired() returned, forced to 2 for YUV format
05-27 15:44:56.633   418  5509 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xab638000, fd = 129, handle = 0x1 length = 5898240, ION Fd = 128
05-27 15:44:56.633   418  5509 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xab098000, fd = 134, handle = 0x1 length = 5898240, ION Fd = 133
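The numbers above fit together: the mct_stream_create_buffers lines show stride 2560 with scanlines 1536 (Y) and 768 (CbCr), and the allocated frame_len is 5898240. A sketch of the semi-planar YUV 4:2:0 layout these imply:

```c
#include <stddef.h>

/* Per-plane sizes for a semi-planar YUV 4:2:0 buffer:
 * plane 0 (Y): stride x scanline; plane 1 (interleaved CbCr): stride x scanline/2.
 * Matches the mct_stream_create_buffers lines (stride 2560, scanlines 1536/768). */
struct yuv_sp_layout { size_t y_len, cbcr_len, cbcr_offset, frame_len; };

static struct yuv_sp_layout yuv_sp_420(size_t stride, size_t scanline) {
    struct yuv_sp_layout l;
    l.y_len       = stride * scanline;
    l.cbcr_len    = stride * (scanline / 2);
    l.cbcr_offset = l.y_len;              /* CbCr plane starts right after Y */
    l.frame_len   = l.y_len + l.cbcr_len;
    return l;
}
```

For 2560x1536 this gives a Y plane of 3932160 bytes and a CbCr plane of 1966080 bytes, the same values the jpeg encoder logs later report as cbcrStartOffset and cb_len.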
Turning on the snapshot stream:
05-27 15:44:56.633   418  5509 E mm-camera-intf: mm_stream_streamon: E, my_handle = 0xb01, fd = 67, state = 6
05-27 15:44:56.633   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:56.633   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:56.633   479  5550 E mm-camera: mct_pipeline_process_set:command=8000008
05-27 15:44:56.633   479  5550 E mm-camera: mct_pipeline_process_set: stream_type = 3  ------  snapshot
Setting the resolution; this goes all the way down to the driver file:
{In module_sensor_stream_on() the order follows the xxx_res_cfg[] array in the sensor's xxx_lib.c: SENSOR_SET_NEW_RESOLUTION / SENSOR_SET_CSIPHY / SENSOR_CSID_CFG / SENSOR_SEND_EVENT / SENSOR_SET_START_STREAM / SENSOR_GET_YUV_EXIF.
Some YUV sensors also put SENSOR_SET_STOP_STREAM in the first array, which in principle should be unnecessary. Bayer sensors additionally run SENSOR_LOAD_CHROMATIX before SENSOR_SEND_EVENT to load the color tuning.}
mct_stream_start_link() {adds the various mods depending on preview/video/snapshot/postview; e.g. sensor is set to MCT_MODULE_FLAG_SOURCE (pproc/faceproc can also be source mods)} -> sensor->set_mod(source)
-> this reaches module_sensor.c module_sensor_set_mod(MCT_MODULE_FLAG_SOURCE), which assigns module_sensor_module_process_event() to the process_event function pointer.
mm_camera_stream() -> ioctl(VIDIOC_STREAMON) -> camera.c post(MCT_EVENT_CONTROL_STREAMON) -> mct_pipeline.c -> pipeline->send_event(MCT_EVENT_CONTROL_STREAMON) -> mct_stream.c
-> mct_stream_send_event() {
  1. check whether the stream has a child stream and cast it if so;
  2. check MCT_MODULE_FLAG_SOURCE ---- this check is why metadata never reaches port_sensor_handle_stream_on: metadata takes a different path, mct_stream_metadata_ctrl_event();
  3. if it is a source module, call src_module->process_event.
That is, metadata only goes through mct_stream.c: mct_stream_metabuf_find_bfr_mngr_subdev() and MSM_CAMERA_SUBDEV_BUF_MNGR get an fd from msm_generic_buf_mgr.c; then VIDIOC_MSM_BUF_MNGR_GET_BUF reaches msm_generic_buf_mngr.c -> msm_buf_mngr_subdev_ioctl() -> msm_buf_mngr_get_buf() -> new_entry->vb2_buf = buf_mngr_dev->vb2_buf_ops.get_buf(session_id,stream_id). Where is get_buf defined?
msm_generic_buf_mgr.c -> v4l2_subdev_notify(MSM_SD_NOTIFY_REQ_CB,buf_mngr_dev->vb2_buf_ops) -> msm.c msm_sd_notify(arg) -> msm_vb2_request_cb(arg) -> arg->get_buf = msm_vb2_get_buf(), which takes msm_vb2->vb2_buf from stream->queued_list and only returns buf_info if the buf is VB2_BUF_STATE_ACTIVE and not on in_freeq. That buf_info is then compared one by one against mct_stream's stream->buffers.img_buf list; a buf with the same index is returned. And then? Returned for what?
}
When module_sensor.c receives MCT_EVENT_CONTROL_STREAMON, the handler is the module_sensor_module_process_event() registered above -> it changes the state to SENSOR_START_PENDING and calls port_sensor_handle_stream_on().
Below is the snapshot data path:
05-27 15:44:56.633   479  5550 E mm-camera-sensor: port_sensor_handle_stream_on:467 H/W revision = 2(2), Criterion ver = 0  --->   calls the non-fast_aec_mode stream_on function
05-27 15:44:56.633   479  5550 E mm-camera-sensor: module_sensor_stream_on:2091 ide 20002 SENSOR_START_STREAM  ----- this function is called
05-27 15:44:56.633   479  5550 W mm-camera-sensor: module_sensor_stream_on output format = 1       ------------ (handles case SENSOR_SET_START_STREAM)
05-27 15:44:56.633   479  5550 E mm-camera-sensor: module_sensor_is_ready_for_stream_on:1306 s_bundle->stream_on_count 0
05-27 15:44:56.633   479  5550 E mm-camera-sensor: module_sensor_is_ready_for_stream_on:1307 is_bundle_started 0  ----- called when is_fast_aec_mode_on == false
05-27 15:44:56.633   479  5550 E mm-camera-sensor: modules_sensor_set_new_resolution:315 SENSOR_SET_RESOLUTION 2560*1536 mask 8   --- (handles case SENSOR_SET_NEW_RESOLUTION, writes the snapshot registers)
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_set_resolution:2484 width 2560, height 1536    (the driver writes LED_HIGH / Main_flash_on / snapshot_regs / wait_capture_stable / set_exif, i.e. the exif registers)
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_set_resolution:2486 stream mask 8 hfr mode 0 fps 30.000000
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2000 requested aspect ratio 166
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2283 sup_asp_ratio =  133
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2321 preview i 0 x_output 2576 y_output 1932 max fps 15.000000 sup asp ratio 133
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2329 Requested size -- width = 2560 , height = 1536
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2333 pick res 0   CAM_STREAM_TYPE_SNAPSHOT  ------ first the capture mode is set into the sensor driver
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_set_resolution:2605
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_set_resolution:2607 current_fps_div 1024   ------ then SENSOR_SET_RESOLUTION is sent to the driver
05-27 15:44:56.653   479  5550 E mm-camera: receive MCT_EVENT_MODULE_SET_STREAM_CONFIG   -------- issued from module_sensor.c; many modules can handle this command, e.g. port_ispif.c
05-27 15:44:56.653   479  5550 E mm-camera: ispif_util_dump_sensor_cfg: sensor dim: width = 5152, heght = 1932, fmt = 10, is_bayer = 0 -> ispif.c isp_sink_port_config() ->ispif_util_dump_sensor_cfg
05-27 15:44:56.653   479  5550 E mm-camera: ispif_util_dump_sensor_cfg: camif_crop: first_pix = 0, last_pix = 5151, first_line = 0, last_line = 1931, max_fps = 15
05-27 15:44:56.653   479  5550 E mm-camera: reserve_camif_resource: is_ispif = 1, sess_idx = 0, fps = 15, num_isps = 1 op clk: 144000000
--> use_pix decides between dual VFE and RDI, presumably in isp_resource_mgr.c / isp_hw.c, based on stream_info->fmt or sensor_cap->sensor_cid_ch[0].fmt
05-27 15:44:56.653   479  5550 E mm-camera: isp_sink_port_stream_config: E, session_id = 2, stream_id = 2, stream_type = 3
05-27 15:44:56.653   479  5550 E mm-camera: isp_sink_port_stream_config: session_id = 2, stream_id = 2, is_split = 0
05-27 15:44:56.653   479  5550 E mm-camera: isp_sink_port_stream_config: old vfe_id_mask = 0x1, new vfe_id_mask = 0x1
05-27 15:44:56.673   479  5550 E mm-camera: cpp_module_handle_streamon_event:1761, identity=0x20002, stream-on done
05-27 15:44:56.673   479  5550 E mm-camera: isp_streamon: E, session_id = 2         ----- called by port_isp.c on MCT_EVENT_CONTROL_STREAMON
05-27 15:44:56.673   479  5522 E mm-camera: isp_proc_async_command: E ISP_ASYNC_COMMAND_STREAMON = 2
05-27 15:44:56.673   479  5522 E mm-camera: isp_proc_streamon: E, session_id = 2, stream_id = 2, stream_type = 3 num_bufs 2 -- called by ispif_streamon, which port_ispif in turn invokes on MCT_EVENT_CONTROL_STREAMON
05-27 15:44:56.673   479  5624 E mm-camera: isp_hw_proc_set_recording_hint: recording_hint: 0  --- not recording mode
05-27 15:44:56.673   479  5624 E mm-camera: demux_set_params: param_id 7, is not supported in this module
05-27 15:44:56.673   479  5624 E mm-camera: returning from demux as this is YUV sensor
05-27 15:44:56.673   479  5522 E mm-camera: isp_proc_streamon: set_all_saved_params done, session_id = 2
05-27 15:44:56.673   479  5522 E mm-camera: isp_proc_streamon: sending dim downstream done, session_id = 2
05-27 15:44:56.673   479  5624 E mm-camera: isp_axi_util_subscribe_v4l2_event: event_type = 0x8000100, is_subscribe = 1  --- subscribe; on close this is set back to 0
05-27 15:44:56.683   479  5522 E mm-camera: isp_proc_async_command: X ISP_ASYNC_COMMAND_STREAMON = 2
05-27 15:44:56.683   479  5522 E mm-camera: isp_proc_async_command: X, session_id = 2, async_cmd_id = 1
05-27 15:44:56.683   479  5550 E mm-camera: isp_streamon: X, session_id = 2

First CFG_SET_START_STREAM is sent to the driver (a no-op in capture mode; only recording and preview are handled), then get_exif fetches the exif info:
05-27 15:44:56.693   479  5550 E mm-camera-sensor: module_sensor_stream_on: get exif E -- only YUV sensors call this to fetch exif info; the exif was already written during SENSOR_SET_RESOLUTION, here it is only read back
05-27 15:44:56.693   479  5550 E mm-camera-sensor: exp_time : 33 iso_value : 50
05-27 15:44:56.693   479  5550 E mm-camera-sensor: module_sensor_stream_on: get exif X
05-27 15:44:56.693   479  5550 E mm-camera: start_sof_check_thread: Starting SOF timeout thread  ----> sends MCT_BUS_MSG_SENSOR_STARTING to mct_bus.c
-------------------------------- state machine reference --------------------------------
05-27 15:44:44.583   418   418 E QCameraStateMachine: QCameraStateMachine:116 change m_state to 0 ---- right after opening the camera: cameraDeviceOpen() -> new QCamera2HardwareInterface(camera_id) ->
05-27 15:44:44.583   418  5509 E QCameraStateMachine: smEvtProcRoutine: E            -> initializing the m_stateMachine member runs its class constructor -> creates the smEvtProcRoutine thread
05-27 15:44:47.123   418  5509 E QCameraStateMachine: procEvtPreviewStoppedState:408 change m_state to 2     --> stopped (0) -> previewing (2)
05-27 15:44:56.233   418  5509 E QCameraStateMachine: procEvtPreviewingState:1167 change m_state to 4        --> preview (2) -> STATE_PIC_TAKING (4)
The state machine is here now; only after m_parent->takePicture() returns does it call m_parent->signalAPIResult() -> QCamera2HWI.cpp, which triggers signalAPIResult to tell the HAL's waitAPIResult that the take-picture event has finished.
05-27 15:44:57.153   418  5509 E QCameraStateMachine: [BS_DBG] QCAMERA_SM_EVT_SNAPSHOT_DONE - cancelPicture
05-27 15:44:57.253   418  5509 E QCameraStateMachine: procEvtPicTakingState:1787 change m_state to 0
05-27 15:44:57.253   418  5509 E QCameraStateMachine: [BS_DBG] QCAMERA_SM_STATE_PREVIEW_STOPPED
05-27 15:44:58.073   418  5509 E QCameraStateMachine: procEvtPreviewStoppedState:408 change m_state to 2
---------------------------------------------------------------------------------------------------------------------------------
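The m_state values logged above (0 stopped, 2 previewing, 4 picture taking) can be summarized as a small transition table. Only the transitions observed in this trace are modeled; state/event names are paraphrased, the real QCameraStateMachine handles many more:

```c
/* Observed QCameraStateMachine transitions from the trace above.
 * Numeric values match the "change m_state to N" log lines. */
enum hal_state { STATE_PREVIEW_STOPPED = 0, STATE_PREVIEWING = 2, STATE_PIC_TAKING = 4 };
enum hal_evt   { EVT_START_PREVIEW, EVT_TAKE_PICTURE, EVT_SNAPSHOT_DONE };

static enum hal_state hal_sm_step(enum hal_state s, enum hal_evt e) {
    if (s == STATE_PREVIEW_STOPPED && e == EVT_START_PREVIEW) return STATE_PREVIEWING;
    if (s == STATE_PREVIEWING      && e == EVT_TAKE_PICTURE)  return STATE_PIC_TAKING;
    if (s == STATE_PIC_TAKING      && e == EVT_SNAPSHOT_DONE) return STATE_PREVIEW_STOPPED;
    return s; /* events not seen in this trace: stay */
}
```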

The take-picture call reaches the bottom layer, completes and returns to the HAL:

05-27 15:44:56.693   418  5509 E QCamera2HWI: takePicture: X mCameraId=0 --(stopChannel(preview) / delChannel(preview) / addCaptureChannel() / m_postprocessor.start() / startChannel(capture)) done
05-27 15:44:56.693   418  5509 E QCamera2HWI: signalAPIResult: result->request_api=19 result->result_type=0 E   ----> 19 -- QCAMERA_SM_EVT_TAKE_PICTURE , API_RESULT_TYPE_DEFAULT (0)
05-27 15:44:56.693   418  5963 E QCamera2HWI: waitAPIResult: return (0) from API result wait for evt (19)
05-27 15:44:56.693   418  5963 E QCamera2HWI: [KPI Perf] take_picture_internal: X, ret 0    ----> sends QCAMERA_SM_EVT_TAKE_PICTURE to QCameraStateMachine -> takePicture now returns here
05-27 15:44:56.693   418  5963 E QCamera2HWI: playShutter in take_picture_internal ? LINE =1211  -----> calls playShutter() to tell the APP to play the shutter sound
05-27 15:44:56.693   418  5963 E QCamera2HWI: [KPI Perf] take_picture_thread: X     ------> returns to take_picture -> back to the APP
05-27 15:44:56.693   418  5561 E QCamera2HWI: cbNotifyRoutine: get cmd 3       -------> CAMERA_CMD_TYPE_DO_NEXT_JOB
05-27 15:44:56.693   418  5561 E QCamera2HWI: cbNotifyRoutine: cb type 0 cb msg type 2 received  -- /android/system/core/include/system/camera.h  CAMERA_MSG_SHUTTER = 0x00000002 // notifyCallback
05-27 15:44:56.693   418  5561 E QCamera2HWI: send QCAMERA_NOTIFY_CALLBACK (2)  ---- probably means the capture completed ---> 56.633 SENSOR_START_STREAM -> 56.693 callback == 60ms, a quick return
05-27 15:44:56.853   418  5561 E QCamera2HWI: return QCAMERA_NOTIFY_CALLBACK (2)   ---- return to the APP done, 160ms in total
05-27 15:44:56.833  5365  5380 D SecCamera-JNI-Java: postEventFromNative: 2
05-27 15:44:56.833  5365  5365 D SecCamera-JNI-Java: handleMessage: 2
05-27 15:44:56.833  5365  5365 V CommonEngine: ShutterCallback.onShutter
05-27 15:44:56.833   418   418 D CameraClient: sendCommand (pid 5365)
05-27 15:44:56.833   418   418 D CameraClient: ENABLE_SHUTTER_SOUND (1, 1)  ----- play the shutter sound
05-27 15:44:56.833   418  5561 D CameraClient: disableMsgType : msg(0x2, 0xd0d)
05-27 15:44:56.833   418  5561 D SecCameraCoreManager: disableMsgType : msg(In:0x2, Out:0xd0d)
05-27 15:44:56.833   418  5561 I ShotCommon: setAppUsePreviewFrame(0)
05-27 15:44:56.833   418  5561 I ShotCommon: disablePreviewMsgBy : msg(In:0x2, Out:0x0)
05-27 15:44:56.833   418  5561 E QCamera2HWI: disable_msg_type : E, msg type 2
05-27 15:44:56.833   418  5561 E QCamera2HWI: waitAPIResult: wait for API result of evt (4)
05-27 15:44:56.843   418  5509 E QCamera2HWI: signalAPIResult: result->request_api=4 result->result_type=0 E  ----- QCAMERA_SM_EVT_DISABLE_MSG_TYPE
05-27 15:44:56.853   418  5964 V AudioSink: deleteRecycledTrack_l
05-27 15:44:56.853   418  5964 V AudioSink: setVolume
05-27 15:44:56.863   418  5964 V AudioSink: open() DONE status 0
05-27 15:44:56.863   418  5964 V AudioSink: setPlaybackRate(1.000000 1.000000 0 0)
05-27 15:44:56.863   418  5964 V AudioSink: start   ---- play the sound
05-27 15:44:56.863   418  5964 W AudioPolicyIntefaceImpl: Skipped to add effects on session 23

The capture data thread returns; the data now needs JPEG encoding:

05-27 15:44:56.903   418  5969 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xab098000, fd = 134, handle = 0x1 length = 5898240, ION Fd = 133 --> reaches QCameraStream.h -> QCameraMem.h -> mMemInfo[idx]
05-27 15:44:56.903   418  5972 E QCamera2HWI: [KPI Perf] capture_channel_cb_routine : E PROFILE_YUV_CB_TO_HAL  ----- 56.633 -> 56.903 = 270ms (SENSOR_START_STREAM to capture data return is the interval that matters)
05-27 15:44:56.903   418  5975 E QCameraStream: dataNotifyCB:
05-27 15:44:56.903   418  5972 E QCamera2HWI: needReprocess: YUV Sensor: Reprocessing disabled  ----> processData()->m_parent->needReprocess()
05-27 15:44:56.903   418  5975 E QCameraStream: processDataNotify:
05-27 15:44:56.903   418  5972 E QCameraPostProc: processData: no need offline reprocess, sending to jpeg encoding --- NV16/NV21 raw data: create a jpeg_job, push it onto the m_inputJpegQ queue, do_next_job
05-27 15:44:56.903   418  5972 E QCamera2HWI: [KPI Perf] capture_channel_cb_routine: X
05-27 15:44:56.903   418  5971 E QCameraStream: dataProcRoutine: Do next job
05-27 15:44:56.903   418  5557 E QCameraPostProc: dataProcessRoutine: Do next job, active is 1 --> startSnapshots set active == 1; take jobs from m_inputJpegQ, move them to m_ongoingJpegQ and call pme->encodeData()
05-27 15:44:56.903   418  5557 E QCameraPostProc: encodeData : E
Parse the returned frame: CAM_STREAM_TYPE_SNAPSHOT --> main_frame; CAM_STREAM_TYPE_PREVIEW / CAM_STREAM_TYPE_POSTVIEW ---> thumbnail_stream
05-27 15:44:56.903   418  5557 E QCameraStream:  isOrignalTypeOf : stream_type 7, (METADATA), pp_type 0 (online), input_stream_type 0(default)   (type 6 would be CAM_STREAM_TYPE_YUV)
05-27 15:44:56.903   418  5557 E QCameraStream:  isOrignalTypeOf : stream_type 7, pp_type 0, input_stream_type 0
05-27 15:44:56.903   418  5557 E QCameraStream:  isOrignalTypeOf : stream_type 7, pp_type 0, input_stream_type 0
05-27 15:44:56.903   418  5557 E QCameraPostProc: encodeData: frame id = 1
05-27 15:44:56.903   418  5557 E QCameraPostProc: getJpegEncodingConfig : E   ---> m_parent->getExifData()
05-27 15:44:56.903   418  5557 E QCamera2HWI: getExifData: getExifGpsProcessingMethod null
05-27 15:44:56.903   418  5557 E QCamera2HWI: getExifData: getExifLatitude null
05-27 15:44:56.903   418  5557 E QCamera2HWI: getExifData: getExifLongitude null
05-27 15:44:56.903   418  5557 E QCamera2HWI: getExifData: getExifAltitude null
05-27 15:44:56.903   418  5557 E QCamera2HWI: getExifData: GPS not enalbe
05-27 15:44:56.903   418  5557 E QCamera2HWI: EXIF - Thumbnail Size : 480x288
05-27 15:44:56.903   418  5557 E QCameraHWI_Mem: alloc: use ION_SYSTEM_HEAP_ID memory: (33554432)
05-27 15:44:56.903   418  5557 E QCameraHWI_Mem: size before align: 5898240   ---> m_pJpegOutputMem = new QCameraStreamMemory(); m_pJpegOutputMem->allocate(1, main_offset, frame_len)
05-27 15:44:56.903   418  5557 E QCameraHWI_Mem: size after align: 5898240
05-27 15:44:56.973   418  5557 E QCameraPostProc: getJpegEncodingConfig : X---> mJpegClientHandle = jpeg_open(&mJpegHandle);
05-27 15:44:56.973   418  5557 E QCameraPostProc: [KPI Perf] encodeData : call jpeg create_session  ---> mJpegHandle.create_session(mJpegClientHandle,encodeParam,mJpegSessionId)
05-27 15:44:57.023   418  5557 E qomx_image_core: OMX_GetHandle:236] get instance pts is 0xb8948ab8
05-27 15:44:57.023   418  5557 D qomx_image_core: OMX_GetHandle:256] handle = b8948abc Instanceindex = 0,comp_idx 0 g_ptr 0xb88ee0b8
05-27 15:44:57.023   418  5557 E mm-still: OMX_ERRORTYPE qomx_component_set_callbacks(OMX_HANDLETYPE, OMX_CALLBACKTYPE*, OMX_PTR): 181: This ptr addr 0xb8948ab8
05-27 15:44:57.023   418  5557 E qomx_image_core: OMX_GetHandle:260] Success
05-27 15:44:57.023   418  5557 E QCameraPostProc:  encodeData : No thumbnail stream using main image!  ----- >  thumb_stream == null
05-27 15:44:57.023   418  5557 E QCameraPostProc: encodeData:1446] [LED_FLASH_DBG] frame id 1 buf_idx 0   ---->jpeg_job_data->src_frame->bufs[i]->stream_type == CAM_STREAM_TYPE_METADATA
05-27 15:44:57.023   418  5557 E QCameraPostProc: [KPI Perf] encodeData : PROFILE_JPEG_JOB_START         -----> mJpegHandle.start_job()
05-27 15:44:57.023   418  5557 E mm-jpeg-intf: mm_jpeg_start_job:1999] session_idx 0 client idx 0
05-27 15:44:57.023   418  5557 E QCameraPostProc: encodeData : X
05-27 15:44:57.033   418  5556 E mm-jpeg-intf: mm_jpeg_session_config_main_buffer_offset:619] yOffset = 0, cbcrOffset = (0 0), totalSize = 5898240,cbcrStartOffset = (3932160 1966080)
05-27 15:44:57.033   418  5556 E mm-jpeg-intf: mm_jpeg_encoding_mode:666] encoding mode = 1
05-27 15:44:57.053   418  5556 E jpeg_hw : open /dev/jpeg0: fd = 154
05-27 15:44:57.053   418  5556 E jpeg_hw : jpege_lib_init:390] handler 0x0 0x0 0x0
05-27 15:44:57.053   418  5556 E jpeg_hw : jpege_lib_init:430] Successful
05-27 15:44:57.053   418  5556 E mm-jpeg-intf: process_meta_data: exp_Time:33.000000, Br value:0.000000, shutter_speed:0.000000, iso:50 flash_mode 0 0  ----> processing the metadata
05-27 15:44:57.063   418  5556 E mm-jpeg-intf: mm_jpeg_session_config_thumbnail:860] encode_thumbnail 1
05-27 15:44:57.063   418  5556 E mm-jpeg-intf: mm_jpeg_configure_job_params:1432] Work buffer 57 0xad8db000 WorkBufSize 20000768
05-27 15:44:57.063   418  5978 E mm-still: virtual OMX_ERRORTYPE OMXJpegEncoder::startEncode():398] startEncode()
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl MSM_JPEG_IOCTL_RESET: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_hw_config:568] result 0
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_hw_config:573] Version 30020000
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_cmd_core_cfg: core cfg value = 10a005b
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_cmd_core_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : Input cbcr order = 1
05-27 15:44:57.063   418  5978 E jpeg_hw : FE_CFG: Num of input planes = 2, Memory format = 1  ------- 2 planes: Y and CbCr
05-27 15:44:57.063   418  5978 E jpeg_hw : FE_CFG = a03b170
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_fe_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_hw_fe_buffer_cfg:300] w 2560 h 1536 stride 2560 scanline 1536   ----- width and height info
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_hw_fe_buffer_cfg: numofplanes = 2, chroma height = 768, chroma width = 2560
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_fe_buffer_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_we_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : we_buffer_cfg PLN0_WR_BUFFER_SIZE = 67128396
05-27 15:44:57.063   418  5978 E jpeg_hw : we_buffer_cfg PLN0_WR_STRIDE = 19532
05-27 15:44:57.063   418  5978 E jpeg_hw : we_buffer_cfg PLN0_WR_HSTEP = 19532
05-27 15:44:57.063   418  5978 E jpeg_hw : we_buffer_cfg PLN0_WR_VSTEP = 1024
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_we_buffer_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : InputFormat =3, Encode cfg = 83
05-27 15:44:57.063   418  5978 E jpeg_hw : width = 2560, padded_height = 1536 -->>>>>>>>>>  2560 * 1536 == 3932160   Requested picture size 2560 x 1536
---------------------------------------------------------
05-27 15:44:56.633   479  5550 E mm-camera-sensor: modules_sensor_set_new_resolution:315 SENSOR_SET_RESOLUTION 2560*1536 mask 8
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_set_resolution:2484 width 2560, height 1536
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_set_resolution:2486 stream mask 8 hfr mode 0 fps 30.000000
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2000 requested aspect ratio 166
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2283 sup_asp_ratio =  133
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2321 preview i 0 x_output 2576 y_output 1932 max fps 15.000000 sup asp ratio 133  --- sizes do not match
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2329 Requested size -- width = 2560 , height = 1536
05-27 15:44:56.633   479  5550 E mm-camera-sensor: sensor_pick_resolution:2333 pick res 0   CAM_STREAM_TYPE_SNAPSHOT --- whether the size matches does not matter here; as long as stream_mask matches, the first entry is picked
05-27 15:44:56.653   479  5550 E mm-camera: ispif_util_dump_sensor_cfg: sensor dim: width = 5152, heght = 1932, fmt = 10, is_bayer = 0   ----- ispif dimensions set according to the sensor output size
05-27 15:44:56.653   479  5550 E mm-camera: ispif_util_dump_sensor_cfg: camif_crop: first_pix = 0, last_pix = 5151, first_line = 0, last_line = 1931, max_fps = 15
05-27 15:44:56.653   479  5550 E mm-camera: reserve_camif_resource: is_ispif = 1, sess_idx = 0, fps = 15, num_isps = 1 op clk: 144000000
-----------------------------------------------------------
05-27 15:44:57.063   418  5978 E jpeg_hw : encoded_height = 95 , 5f, encoded_width = 159, 9f
05-27 15:44:57.063   418  5978 E jpeg_hw : nRegVal = 5f009f
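The register value can be reproduced from the image size: 0x9f and 0x5f look like the MCU counts minus one (16x16 MCUs for 4:2:0), packed into one 32-bit word. This decoding is inferred from the numbers in the log, not taken from the driver source:

```python
# Reconstructing nRegVal = 5f009f from the 2560x1536 image.
# Assumption (inferred, not from driver source): the register packs
# (MCU rows - 1) into the upper half-word and (MCU cols - 1) into the
# lower one, with 16x16 MCUs for YUV 4:2:0.
MCU = 16
width, height = 2560, 1536

encoded_width = width // MCU - 1    # 159 == 0x9f
encoded_height = height // MCU - 1  # 95  == 0x5f
nRegVal = (encoded_height << 16) | encoded_width

print(hex(encoded_width), hex(encoded_height))  # 0x9f 0x5f
print(hex(nRegVal))                             # 0x5f009f
```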
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_encode_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_default_scale_cfg: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpeg_lib_hw_encode_state: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : ioctl jpege_lib_hw_read_quant_tables: rc = 0
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_hw_config:720] success
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_output_buf_enq:795] output_buf: 0x0xad8db000 enqueue 20000768, fd 57, result 0
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_input_buf_enq:756] input_buf: 0x0xab098000 enqueue 3932160, offset 0,fd 134
05-27 15:44:57.063   418  5978 E jpeg_hw : jpege_lib_input_buf_enq:761] y_off=0x0 cbcr_off=0x0 num_of_mcu_rows=0,cr_len=0, cb_len =1966080
05-27 15:44:57.073   418  5980 E jpeg_hw : ioctl /dev/jpeg0: rc = 0
05-27 15:44:57.073   418  5978 E mm-still: bool QImageHybridEncoder::IsAvailable(QIEncodeParams&):203] scale input 2560x1536 crop (0, 0) output 480x288 rotation 0 --- the size actually requested by the HAL layer
05-27 15:44:57.073   418  5978 E mm-still: virtual int QImageHybridEncoder::setEncodeParams(QIEncodeParams&):539] Rotation 0 Quality 35
05-27 15:44:57.073   418  5978 E mm-still: virtual int QImageHybridEncoder::addInputImage(QImage&):699] Y addr 0xab098000 len 3932160 fd 134   ---- address of the Y plane
05-27 15:44:57.073   418  5978 E mm-still: virtual int QImageHybridEncoder::addInputImage(QImage&):717] CbCr addr 0xab458000 len 1966080 fd 134  ---- address of the CbCr plane; the difference between the two is 0x3C0000 == 3932160
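The two addresses confirm that Y and CbCr live in one contiguous ION buffer (same fd 134): the CbCr pointer sits exactly one luma plane past the Y pointer. Checking the arithmetic:

```python
# Verify the Y/CbCr addresses from the log describe one contiguous NV12 buffer.
y_addr    = 0xAB098000
cbcr_addr = 0xAB458000
y_len     = 3932160   # 2560 * 1536, from the log
cbcr_len  = 1966080   # 2560 * 768, from the log

offset = cbcr_addr - y_addr
print(hex(offset))       # 0x3c0000
print(offset == y_len)   # True: CbCr starts right after the Y plane
print(y_len + cbcr_len)  # 5898240, the full buffer length seen at stream-off
```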
05-27 15:44:57.083   418  5981 I mm-still: jpegw_configure,216
05-27 15:44:57.083   418  5981 I mm-still: jpegw_configure,227, p_mobicat_data->length: 0
05-27 15:44:57.083   418  5980 E jpeg_hw : jpege_lib_get_event:137] MSM_JPEG_IOCTL_EVT_GET rc = 0
05-27 15:44:57.083   418  5980 E jpeg_hw : jpege_lib_get_input:158] MSM_JPEG_IOCTL_INPUT_GET rc = 0
05-27 15:44:57.093   418  5980 E jpeg_hw : jpege_lib_get_output:185] MSM_JPEG_IOCTL_OUTPUT_GET rc = 0
05-27 15:44:57.133   418  5984 I mm-still: jpege_engine_hybrid_event_handler: job done
05-27 15:44:57.133   418  5981 E mm-still: void QImageHybridEncoder::Encode():1056] Encode done : output_size 8397
05-27 15:44:57.133   418  5981 E mm-still: virtual int OMXJpegEncoder::EncodeComplete(QImage*):557] Exif length: 10429
05-27 15:44:57.143   418  5978 E mm-still: virtual int QImageHybridEncoder::Stop():907] Stop
05-27 15:44:57.143   418  5978 E mm-jpeg-intf: mm_jpeg_fbd:2377] count 0
05-27 15:44:57.143   418  5978 E mm-jpeg-intf: [KPI Perf] : PROFILE_JPEG_FBD
How does control reach this callback? --> openCamera -> m_postprocessor.init(jpegEvtHandle, this) -> mJpegCB = jpegEvtHandle -> encode_parm.jpeg_cb = mJpegCB -> mJpegHandle.create_session(encode_parm)
05-27 15:44:57.143   418  5978 E QCamera2HWI: [BS_DBG] jpegEvtHandle: Call jpeg callback through the state machine --> when longshot is not enabled: QCAMERA_SM_EVT_JPEG_EVT_NOTIFY -> processJpegNotify()
05-27 15:44:57.143   418  5978 E mm-jpeg-intf: mm_jpeg_queue_remove_job_by_job_id:2550] found matching job id
05-27 15:44:57.143   418  5509 E QCameraPostProc: [KPI Perf] processJpegEvt : jpeg job 16777216     -->> m_postprocessor.processJpegEvt
05-27 15:44:57.143   418  5509 E QCamera2HWI:  continueCACSave : CACSave 0  --- compares CACSave against 0 to check whether this is a burst that needs CAC. jpeg_mem = m_parent->mGetMemory(evt->out_data.buf_filled_len) allocates memory sized to the encoded output, then memcpy(jpeg_mem->data, evt->out_data.buf_vaddr, evt->out_data.buf_filled_len) copies the data in
05-27 15:44:57.153   418  5509 E QCameraPostProc: processJpegEvt : Calling upperlayer callback to store JPEG image, jpeg size=829272
05-27 15:44:57.153   418  5509 E QCameraPostProc: [KPI Perf] processJpegEvt: PROFILE_JPEG_CB  -->> sendDataNotify(CAMERA_MSG_COMPRESSED_IMAGE, jpeg_mem) --> QCameraPostProc::sendDataNotify
05-27 15:44:57.153   418  5509 E QCameraPostProc: releaseJpegJobData: E  --> after jpeg_mem is released, releaseJpegJobData is called to free the job
05-27 15:44:57.153   418  5561 E QCamera2HWI: cbNotifyRoutine: get cmd 3
05-27 15:44:57.153   418  5561 E QCamera2HWI: cbNotifyRoutine: cb type 3 cb msg type 256 received
05-27 15:44:57.153   418  5509 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xacf51000, fd = 69, handle = 0x1 length = 450560, ION Fd = 68
05-27 15:44:57.153   418  5561 E QCamera2HWI: processSyncEvt: evt=30  --> presumably sendDataNotify() has already returned to the HAL at this point; only afterwards is release called to free the memory block
05-27 15:44:57.153   418  5509 E QCameraHWI_Mem: cacheOpsInternal: addr = 0xab098000, fd = 134, handle = 0x1 length = 5898240, ION Fd = 133
05-27 15:44:57.153   418  5509 E QCameraPostProc: releaseJpegJobData: DEBUG : src_reproc_frame 1139
05-27 15:44:57.153   418  5509 E QCameraPostProc: releaseJpegJobData: X
---- in processJpegEvt() -> sendDataNotify(CAMERA_MSG_COMPRESSED_IMAGE, jpeg_mem) -> m_parent->m_cbNotifier.notifyCallback(QCAMERA_DATA_SNAPSHOT_CALLBACK) -> sendCmd(CAMERA_CMD_TYPE_DO_NEXT_JOB) -> cbNotifyRoutine() -> pme->mParent->processSyncEvt(QCAMERA_DATA_SNAPSHOT_DONE) -> processEvt() -> m_stateMachine.procEvt() -> handles the QCAMERA_DATA_SNAPSHOT_DONE event -> m_parent->cancelPicture();
05-27 15:44:57.153   418  5509 E QCameraStateMachine: [BS_DBG] QCAMERA_SM_EVT_SNAPSHOT_DONE - cancelPicture  --> current state is PIC_TAKING_STATE
05-27 15:44:57.153   418  5557 E QCameraPostProc: dataProcessRoutine: Do next job, active is 1
05-27 15:44:57.153   418  5509 E QCamera2HWI: cancelPicture: E mCameraId=0  --> in non-HDR mode, m_postprocessor.stop() is called first, then stopChannel(CAPTURE) and delChannel(CAPTURE)
05-27 15:44:57.153   418  5509 E QCamera2HWI: stopSnapshots  ----> stop() -> m_parent->m_cbNotifier.stopSnapshots()
05-27 15:44:57.153   418  5509 E QCameraPostProc: stop: __DEBUG__ before sendCmd
05-27 15:44:57.153   418  5509 E QCameraCmdThread: sendCmd: __DEBUG__ before waiting for sync_sem --> sends CAMERA_CMD_TYPE_STOP_DATA_PROC to QCamera2HWICallbacks.cpp, which sets isSnapshotActive = false
05-27 15:44:57.153   418  5557 E QCameraPostProc: dataProcessRoutine: stop data proc  --> since it waits on the same semaphore, dataProcRoutine is also invoked here; it sets is_active = false
05-27 15:44:57.153   418  5557 E QCamera2HWI: setGPUMinClock : 6  ----- sets the GPU clock
Frees jpeg_job, frees the memory allocated through m_pJpegOutputMem->deallocate, and flushes the queues: ongoingPPQ / inputPPQ / jpegQ / rawQ.
05-27 15:44:57.163   418  5557 E jpeg_hw : jpege_lib_release:496] closed /dev/jpeg0
05-27 15:44:57.163   418  5557 E qomx_image_core: OMX_FreeHandle:331]
05-27 15:44:57.163   418  5557 E qomx_image_core: get_idx_from_handle:290] comp_idx 0 inst_idx 0
05-27 15:44:57.163   418  5557 E qomx_image_core: OMX_FreeHandle:366] Success
05-27 15:44:57.163   418  5557 E QCameraPostProc: dataProcessRoutine: initialize pme->mFramesRecvd
05-27 15:44:57.173   418  5509 E QCameraCmdThread: sendCmd: __DEBUG__ after waiting for sync_sem  ---- m_dataProcTh.sendCmd(CAMERA_CMD_TYPE_STOP_DATA_PROC)
05-27 15:44:57.173   418  5509 E QCameraPostProc: stop: __DEBUG__ after send Cmd  ----- does this mean every handler of the signal has returned? Not necessarily; it only means this function call completed, not that the processing finished
stopChannel(CAPTURE):
05-27 15:44:57.173   418  5509 E QCameraCmdThread: [DBG] exit: Before thread join
05-27 15:44:57.173   418  5970 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): Exit
05-27 15:44:57.173   418  5970 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): X
05-27 15:44:57.173   418  5509 E QCameraCmdThread: [DBG] exit: After thread join
05-27 15:44:57.173   418  5509 E QCameraCmdThread: [DBG] exit: Before thread join
05-27 15:44:57.173   418  5971 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): Exit
05-27 15:44:57.173   418  5971 E QCameraStream: static void* qcamera::QCameraStream::dataProcRoutine(void*): X
05-27 15:44:57.173   418  5509 E QCameraCmdThread: [DBG] exit: After thread join
05-27 15:44:57.173   418  5509 E mm-camera-intf: mm_stream_streamoff: E, my_handle = 0xb01, fd = 67, state = 6  ---- stream off
05-27 15:44:57.173   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:57.173   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:57.173   479  5550 E mm-camera: mct_pipeline_process_set:command=8000009
05-27 15:44:57.173   479  5550 E mm-camera: mct_pipeline_process_set: stream_type = 3
05-27 15:44:57.173   479  5550 E mm-camera: cpp_module_handle_streamoff_event:1887, info: doing stream-off for identity 0x20002
05-27 15:44:57.173   479  5550 E mm-camera: cpp_module_handle_streamoff_event:1933] iden:0x20002, linked_params:0x0
05-27 15:44:57.173   479  5550 E mm-camera: cpp_hardware_process_streamoff:537] skip_iden:0x0, duplicate_stream_status:0x0
05-27 15:44:57.173   479  5550 E mm-camera: cpp_module_handle_streamoff_event:1944, info: stream-off done for identity 0x20002
05-27 15:44:57.173   479  5550 E mm-camera: isp_streamoff: E, session_id = 2
05-27 15:44:57.173   479  5522 E mm-camera: isp_proc_async_command: E ISP_ASYNC_COMMAND_STREAMOFF = 2
05-27 15:44:57.173   479  5522 E mm-camera: isp_proc_streamoff: E, session_id = 2, stream_id = 2, stream_type = 3
05-27 15:44:57.173   479  5623 E mm-camera: isp_axi_util_subscribe_v4l2_event: event_type = 0x8000100, is_subscribe = 0
05-27 15:44:57.183   479  5522 E mm-camera: isp_proc_async_command: X ISP_ASYNC_COMMAND_STREAMOFF = 2
05-27 15:44:57.183   479  5522 E mm-camera: isp_proc_async_command: X, session_id = 2, async_cmd_id = 2
05-27 15:44:57.183   479  5550 E mm-camera: isp_streamoff: X, session_id = 2
05-27 15:44:57.183   479  5550 E mm-camera: ispif_proc_streamoff: Enter
05-27 15:44:57.183   479  5550 E mm-camera: ispif_proc_streamoff: Make ISPIF_CFG IOCTL!
05-27 15:44:57.183   479  5550 E mm-camera: ispif_proc_streamoff: ISPIF_CFG IOCTL returns!
05-27 15:44:57.183   479  5550 E mm-camera: ispif_proc_streamoff: X, rc = 0
05-27 15:44:57.183   479  5550 E mm-camera: release_isp_resource <------
05-27 15:44:57.183   479  5550 E mm-camera: release isp0 rdi
05-27 15:44:57.183   479  5550 E mm-camera: release_isp_resource: isp_id = 0, camif_session_bit = 0x0
05-27 15:44:57.183   479  5550 E mm-camera-sensor: led_flash_process:129 CAM Flash Off
05-27 15:44:57.203   479  5550 E mm-camera: stop_sof_check_thread: Stopping/Joining SOF timeout thread
05-27 15:44:57.203   418  5509 E mm-camera-intf: mm_camera_cmd_thread_stop: before join 0xab097930, qsize = 1
05-27 15:44:57.203   418  5975 E mm-camera-intf: mm_camera_cmd_thread: MM_CAMERA_CMD_TYPE_EXIT - cmd_pid = 0xab097930
05-27 15:44:57.203   418  5975 E mm-camera-intf: mm_camera_cmd_thread: X - cmd_pid = 0xab097930
05-27 15:44:57.203   418  5509 E mm-camera-intf: mm_stream_streamoff: E, my_handle = 0xa00, fd = 64, state = 6
05-27 15:44:57.203   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:57.203   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:57.203   479  5550 E mm-camera: mct_pipeline_process_set:command=8000009
05-27 15:44:57.203   479  5550 E mm-camera: mct_pipeline_process_set: stream_type = 7
05-27 15:44:57.203   418  5509 E mm-camera-intf: mm_camera_cmd_thread_stop: before join 0xabcd6930, qsize = 1
05-27 15:44:57.203   418  5974 E mm-camera-intf: mm_camera_cmd_thread: MM_CAMERA_CMD_TYPE_EXIT - cmd_pid = 0xabcd6930
05-27 15:44:57.203   418  5974 E mm-camera-intf: mm_camera_cmd_thread: X - cmd_pid = 0xabcd6930
05-27 15:44:57.203   418  5509 E mm-camera-intf: mm_camera_cmd_thread_stop: before join 0xad0c9930, qsize = 1
05-27 15:44:57.203   418  5973 E mm-camera-intf: mm_camera_cmd_thread: MM_CAMERA_CMD_TYPE_EXIT - cmd_pid = 0xad0c9930
05-27 15:44:57.203   418  5973 E mm-camera-intf: mm_camera_cmd_thread: X - cmd_pid = 0xad0c9930
05-27 15:44:57.203   418  5509 E mm-camera-intf: mm_camera_cmd_thread_stop: before join 0xad1c8930, qsize = 1
05-27 15:44:57.203   418  5972 E mm-camera-intf: mm_camera_cmd_thread: MM_CAMERA_CMD_TYPE_EXIT - cmd_pid = 0xad1c8930
05-27 15:44:57.203   418  5972 E mm-camera-intf: mm_camera_cmd_thread: X - cmd_pid = 0xad1c8930
05-27 15:44:57.243   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:57.243   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:57.243   479  5550 E mm-camera: mct_pipeline_process_set:command=800000b
05-27 15:44:57.253   479   479 E mm-camera: server_process_hal_event:__DBG__ E event id=3
05-27 15:44:57.253   479   479 E mm-camera: server_process_hal_event:__DBG__ X sucess event id=3
05-27 15:44:57.253   479  5550 E mm-camera: mct_pipeline_process_set:command=800000b
05-27 15:44:57.253   479  5550 E mm-camera: cpp_port_check_caps_unreserve:170, identity=0x20002
05-27 15:44:57.253   479  5550 E mm-camera: cpp_port_check_caps_unreserve:179, identity=0x20002, unreserved
05-27 15:44:57.253   418  5509 E QCamera2HWI: cancelPicture: X mCameraId=0  --> cancelPicture finishes and returns
05-27 15:44:57.253   418  5509 E QCameraStateMachine: procEvtPicTakingState:1787 change m_state to 0  ---> QCAMERA_SM_STATE_PREVIEW_STOPPED
05-27 15:44:57.253   418  5509 E QCameraStateMachine: [BS_DBG] QCAMERA_SM_STATE_PREVIEW_STOPPED  --> calls m_parent->signalEvtResult(); evt 30 is QCAMERA_SM_EVT_SNAPSHOT_DONE
05-27 15:44:57.253   418  5509 E QCamera2HWI: signalEvtResult: result->request_api=30 result->result_type=0 E
05-27 15:44:57.253   418  5561 E QCamera2HWI: send QCAMERA_DATA_SNAPSHOT_CALLBACK (256)  ----- still processing the QCAMERA_DATA_SNAPSHOT_CALLBACK command; it is returned to the app via pme->mDataCb()
05-27 15:44:57.253   418  5561 I SecCameraCoreManager: __data_cb msg type 256  ---- delivered to the app; in sendDataNotify(camera_memory_t *data) the memory data is saved into cbArg.data and returned to the app
After returning to the app, releaseNotifyData(data_cb, this) is called only on failure, to free the malloc'd block that holds the address *data of the already-encoded data passed in.
Otherwise, after QCamera2HWICallbacks finishes handling the QCAMERA_DATA_SNAPSHOT_CALLBACK command, it invokes the release-callback pointer embedded in the argument, cb->release_cb(), to free the memory; the real implementation lives in QCameraPostProc.cpp. mDataQ is released via releaseNotifications: mData(releaseNotifications, this) initializes the release function m_dataFn, which is then invoked on Q.flush.
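The ownership pattern described above — the notifier hands a buffer to the app together with a release callback embedded in the argument, and that callback frees the buffer after delivery (or on queue flush) — can be sketched like this. All names are illustrative, not the real HAL types:

```python
# Minimal sketch of the release_cb ownership pattern described above.
# The producer packages data with a release callback; whoever finishes
# with the notification (normal completion or queue flush) invokes it.
# All names here are illustrative, not the real QCameraPostProc code.
class NotifyArg:
    def __init__(self, data, release_cb, user_data):
        self.data = data
        self.release_cb = release_cb
        self.user_data = user_data

released = []

def release_notifications(arg, user_data):  # plays the role of releaseNotifications
    released.append(arg.data)

def deliver_to_app(arg):
    # ... app-side callback consumes arg.data (the encoded JPEG) ...
    arg.release_cb(arg, arg.user_data)      # cb->release_cb() after the callback returns

arg = NotifyArg(b"jpeg-bytes", release_notifications, None)
deliver_to_app(arg)
print(released == [b"jpeg-bytes"])  # True: buffer freed exactly once, after delivery
```

The same `release_cb` is what `Q.flush` calls for entries still queued when the post-processor is stopped, so the buffer is freed on every path.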

The framework prints the EXIF metadata:

05-27 15:44:57.253   418  5561 E ShotCommon: parseDebugInfo
05-27 15:44:57.253   418  5561 E ExifData: Parsing 9838 byte(s) EXIF data...
05-27 15:44:57.253   418  5561 E ExifData: Found EXIF header.
05-27 15:44:57.253   418  5561 E ExifData: Found EXIF header.
05-27 15:44:57.253   418  5561 E ExifData: IFD 0 at 8.
05-27 15:44:57.253   418  5561 E ExifData: Loading 13 entries...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x100 ('ImageWidth')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x101 ('ImageLength')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x10f ('Make')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x110 ('Model')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x112 ('Orientation')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x11a ('XResolution')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x11b ('YResolution')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x128 ('ResolutionUnit')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x131 ('Software')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x132 ('DateTime')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x213 ('YCbCrPositioning')...
05-27 15:44:57.253   418  5561 E ExifData: Sub-IFD entry 0x8769 ('(null)') at 240.
05-27 15:44:57.253   418  5561 E ExifData: Loading 25 entries...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x829a ('ExposureTime')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x829d ('FNumber')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x8822 ('ExposureProgram')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x8827 ('ISOSpeedRatings')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9000 ('ExifVersion')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9003 ('DateTimeOriginal')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9004 ('DateTimeDigitized')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9101 ('ComponentsConfiguration')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9202 ('ApertureValue')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9205 ('MaxApertureValue')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9207 ('MeteringMode')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9208 ('LightSource')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x9209 ('Flash')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0x920a ('FocalLength')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0xa000 ('FlashPixVersion')...
05-27 15:44:57.253   418  5561 E ExifData: Loading entry 0xa001 ('ColorSpace')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa002 ('PixelXDimension')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa003 ('PixelYDimension')...
05-27 15:44:57.263   418  5561 E ExifData: Sub-IFD entry 0xa005 ('(null)') at 659.
05-27 15:44:57.263   418  5561 E ExifData: Loading 2 entries...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x1 ('InteroperabilityIndex')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x2 ('InteroperabilityVersion')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa217 ('SensingMethod')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa301 ('SceneType')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa402 ('ExposureMode')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa403 ('WhiteBalance')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa406 ('SceneCaptureType')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0xa420 ('ImageUniqueID')...
05-27 15:44:57.263   418  5561 E ExifData: Sub-IFD entry 0x8825 ('(null)') at 690.
05-27 15:44:57.263   418  5561 E ExifData: Loading 1 entries...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x0 ('GPSVersionID')...
05-27 15:44:57.263   418  5561 E ExifData: IFD 1 at 708.
05-27 15:44:57.263   418  5561 E ExifData: Loading 9 entries...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x100 ('ImageWidth')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x101 ('ImageLength')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x103 ('Compression')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x112 ('Orientation')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x11a ('XResolution')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x11b ('YResolution')...
05-27 15:44:57.263   418  5561 E ExifData: Loading entry 0x128 ('ResolutionUnit')...
05-27 15:44:57.263   418  5561 E ExifData: Sub-IFD entry 0x201 ('JPEGInterchangeFormat') at 838.
05-27 15:44:57.263   418  5561 E ExifData: Sub-IFD entry 0x202 ('JPEGInterchangeFormatLength') at 8992.
05-27 15:44:57.263   418  5561 E exif_v  : parseDebugInfo w libexif phase 1
05-27 15:44:57.263   418  5561 E ShotCommon: no makernote tag in the exif IFD. We don't need to copy debug info.
05-27 15:44:57.263   418  5561 D ShotSingle: CAMERA_MSG_COMPRESSED_IMAGE E
05-27 15:44:57.263   418  5561 I ShotSingle: processDataCallback COMPRESSED_IMAGE size: 829272
05-27 15:44:57.263   418  5561 D ShotSingle: sendDataCallbackToApp E
05-27 15:44:57.263   418  5561 D ShotSingle: setMakerNoteToEXIF E
05-27 15:44:57.273   418  5561 E ExifManager_SEC: Unload Exif
05-27 15:44:57.273   418  5561 D ShotSingle: setMakerNoteToEXIF X
.....
05-27 15:44:57.303   418  5561 D SecCameraCoreManager: processDataCallback : send data callback msg(0x100)
05-27 15:44:57.303   418  5561 D CameraClient: handleCompressedPicture E
05-27 15:44:57.303   418  5561 D CameraClient: disableMsgType : msg(0x100, 0xc0d)
05-27 15:44:57.303   418  5561 D SecCameraCoreManager: disableMsgType : msg(In:0x100, Out:0xc0d)
05-27 15:44:57.303   418  5561 I ShotCommon: setAppUsePreviewFrame(0)
05-27 15:44:57.303   418  5561 I ShotCommon: disablePreviewMsgBy : msg(In:0x2, Out:0x0)
05-27 15:44:57.303   418  5561 E QCamera2HWI: disable_msg_type : E, msg type 256  ---- is the message type disabled right after the callback returns?
05-27 15:44:57.303   418  5561 E QCamera2HWI: waitAPIResult: wait for API result of evt (4)
05-27 15:44:57.303   418  5509 E QCamera2HWI: signalAPIResult: result->request_api=4 result->result_type=0 E
05-27 15:44:57.303   418  5561 E QCamera2HWI: waitAPIResult: return (0) from API result wait for evt (4)
05-27 15:44:57.303   418  5561 E QCamera2HWI: disable_msg_type : X
05-27 15:44:57.303   418  5561 I ShotCommon: disableMsgType : msg(In:0x100, Out:0xc0d)
05-27 15:44:57.303   418  5561 D CameraClient: handleCompressedPicture X
05-27 15:44:57.313   418  5561 D ShotSingle: sendDataCallbackToApp X
05-27 15:44:57.313   418  5561 I ShotSingle: PictureFormat is Unlocked.
05-27 15:44:57.313   418  5561 D ShotSingle: CAMERA_MSG_COMPRESSED_IMAGE X  ------- returns
05-27 15:44:57.313   418  5561 E QCamera2HWI: return QCAMERA_DATA_SNAPSHOT_CALLBACK (256)  ------ above was send(256) to the app; now the call has finished and returned to the HAL layer, taking about 60 ms
05-27 15:44:57.313   418  5561 E QCameraPostProc: releaseNotifyData: DEBUG : frame, 1072
05-27 15:44:57.313   418  5561 E QCamera2HWI: cbNotifyRoutine: get cmd 2
05-27 15:44:57.313  5365  5638 D SecCamera-JNI-Java: postEventFromNative: 256
05-27 15:44:57.313  5365  5365 D SecCamera-JNI-Java: handleMessage: 256
05-27 15:44:57.323  5365  5365 V CommonEngine: JpegPictureCallback.onPictureTaken
05-27 15:44:57.353  5365  5365 V CommonEngine: setAEAWBLockParameter : false
05-27 15:44:57.353  5365  5473 V CommonEngine: got message...{ when=-1ms what=3 target=com.sec.android.app.camera.engine.CommonEngine$StateMessageHandler }
05-27 15:44:57.353  5365  5473 V CeStateInitialized: HandleMessage - 3
05-27 15:44:57.353  5365  5473 V CeRequestQueue: completeRequest
05-27 15:44:57.353  5365  5473 D CeRequestQueue: [3 ]
05-27 15:44:57.353  5365  5472 V CeRequestQueue: startFirstRequest
05-27 15:44:57.353  5365  5472 V CeStateInitialized: HandleRequest - 3
05-27 15:44:57.353  5365  5472 V CommonEngine: doStartPreviewAsync  ---- restart the preview
05-27 15:44:57.353  5365  5472 V CommonEngine: resetPreviewSize()- WH: 800 480
05-27 15:44:57.413   418  1867 D CameraClient: setParameters (pid 5365)
05-27 15:44:57.423  5365  5472 V CommonEngine: setJPEGThumbnailSize: 480 288
05-27 15:44:57.423  5365  5472 V CommonEngine: setJpegThumbnailSize, mCurrentThumbnailWidth  : 480, mCurrentThumbnailHeight : 288
05-27 15:44:57.463  5365  5365 V CommonEngine: setTouchAutoFocusActive : false
05-27 15:44:57.423   418  1867 E QCamera2HWI: [KPI Perf] set_parameters: E  -> hw->processAPI(QCAMERA_SM_EVT_SET_PARAMS, *parms) -> m_parent->updateParameters()
05-27 15:44:57.423   418  1867 E QCamera2HWI: waitAPIResult: wait for API result of evt (6)
.... many parameters are set, each waiting for its result
05-27 15:44:57.463   418  5509 E QCamera2HWI: signalAPIResult: result->request_api=6 result->result_type=0 E  --- 0 is the default result type
05-27 15:44:57.463   418  1867 E QCamera2HWI: waitAPIResult: return (0) from API result wait for evt (6)  ----- evt 6 is QCAMERA_SM_EVT_SET_PARAMS
05-27 15:44:57.463   418  1867 E QCamera2HWI: [KPI Perf] set_parameters : X, ret 0
