Keyword: BigQuery

Total results: 14

[Single choice] You write a Python script to connect to Google BigQuery from a Google Compute Engine virtual machine. The script is printing errors that it cannot connect to BigQuery.
What should you do to fix the script?

  • A

    Install the latest BigQuery API client library for Python

  • B

    Run your script on a new virtual machine with the BigQuery access scope enabled

  • C

    Create a new service account with BigQuery access and execute your script with that user

  • D

    Install the bq component for gcloud with the command gcloud components install bq.

Answer: C

Explanation:

A - If the client library were not installed, the Python script would not run at all. Since the question states the script reports "cannot connect", the client library must already be installed, so it's B or C.
B - https://cloud.google.com/bigquery/docs/authorization - An access scope limits which OAuth access tokens the VM's default credentials can obtain when calling services via the API. If the VM was created without the BigQuery access scope, recreating it with that scope enabled would work, but it requires a new VM.
C - Using a dedicated service account with BigQuery access is Google Cloud's recommended practice, and it works without recreating the VM.
So prefer C.
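Option C can be sketched with the gcloud CLI roughly as follows. The project ID, service account name, and script filename are placeholders, not from the question:

```shell
# Hypothetical names -- substitute your own.
PROJECT_ID=my-project
SA=bq-script@${PROJECT_ID}.iam.gserviceaccount.com

# Create a service account and grant it BigQuery access.
gcloud iam service-accounts create bq-script --project=${PROJECT_ID}
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:${SA}" \
    --role="roles/bigquery.user"

# Download a key, point the client library at it, and rerun the script.
gcloud iam service-accounts keys create key.json --iam-account=${SA}
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/key.json"
python script.py
```

The `GOOGLE_APPLICATION_CREDENTIALS` variable is how the Python client library picks up the service account credentials without code changes.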


[Single choice] Your company is using BigQuery as its enterprise data warehouse. Data is distributed over several Google Cloud projects. All queries on BigQuery need to be billed to a single project. You want to make sure that no query costs are incurred on the projects that contain the data. Users should be able to query the datasets, but not edit them.
How should you configure users' access roles?

  • A

    Add all users to a group. Grant the group the role of BigQuery user on the billing project and BigQuery dataViewer on the projects that contain the data.

  • B

    Add all users to a group. Grant the group the roles of BigQuery dataViewer on the billing project and BigQuery user on the projects that contain the data.

  • C

    Add all users to a group. Grant the group the roles of BigQuery jobUser on the billing project and BigQuery dataViewer on the projects that contain the data.

  • D

    Add all users to a group. Grant the group the roles of BigQuery dataViewer on the billing project and BigQuery jobUser on the projects that contain the data.

Answer: C

Explanation:

roles/bigquery.jobUser: provides permissions to run jobs, including queries, within the project.
roles/bigquery.user: when applied to a dataset, this role provides the ability to read the dataset's metadata and list tables in the dataset.
When applied to a project, this role also provides the ability to run jobs, including queries, within the project. A member with this role can enumerate their own jobs, cancel their own jobs, and enumerate datasets within a project. It additionally allows the creation of new datasets within the project; the creator is granted the BigQuery Data Owner role (roles/bigquery.dataOwner) on these new datasets.
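The C setup could be applied with gcloud along these lines. Project IDs and the group address are illustrative assumptions:

```shell
# Assumed names -- substitute your own.
BILLING_PROJECT=query-billing-project
DATA_PROJECT=data-project-1
GROUP=group:bq-analysts@example.com

# jobUser on the billing project: members can run queries there,
# so all query costs land on this project.
gcloud projects add-iam-policy-binding ${BILLING_PROJECT} \
    --member=${GROUP} --role=roles/bigquery.jobUser

# dataViewer on each data project: members can read the datasets but
# cannot edit them or run jobs there, so no query costs accrue.
gcloud projects add-iam-policy-binding ${DATA_PROJECT} \
    --member=${GROUP} --role=roles/bigquery.dataViewer
```

Repeat the second binding for every project that contains data.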


[Single choice] Your applications will be writing their logs to BigQuery for analysis. Each application should have its own table. Any logs older than 45 days should be removed. You want to optimize storage and follow Google-recommended practices. What should you do?

  • A

    Configure the expiration time for your tables at 45 days

  • B

    Make the tables time-partitioned, and configure the partition expiration at 45 days

  • C

Rely on BigQuery's default behavior to prune application logs older than 45 days.

  • D

    Create a script that uses the BigQuery command line tool (bq) to remove records older than 45 days

Answer: B

Explanation:

B is correct.
With option A, the whole table is deleted once it expires, not just the old records:
https://cloud.google.com/bigquery/docs/managing-tables#updating_a_tables_expiration_time
"When you delete a table, any data in the table is also deleted. To automatically delete tables after a specified period of time, set the default table expiration for the dataset or set the expiration time when you create the table."
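A partitioned table with a 45-day partition expiration can be created with the bq tool roughly as follows. The dataset, table name, and schema are assumptions for illustration; the expiration flag takes seconds (45 days × 86400 s/day = 3,888,000 s):

```shell
# Create a day-partitioned table whose partitions expire after 45 days.
bq mk --table \
    --time_partitioning_type=DAY \
    --time_partitioning_expiration=3888000 \
    mydataset.app1_logs \
    timestamp:TIMESTAMP,severity:STRING,message:STRING

# For an existing partitioned table, the expiration can be updated in place.
bq update --time_partitioning_expiration=3888000 mydataset.app1_logs
```

Only the expired partitions are dropped; the table itself remains, which is the difference from a table-level expiration.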


[Single choice] Your BigQuery project has several users. For audit purposes, you need to see how many queries each user ran in the last month. What should you do?

  • A

    Connect Google Data Studio to BigQuery. Create a dimension for the users and a metric for the amount of queries per user.

  • B

    In the BigQuery interface, execute a query on the JOBS table to get the required information.

  • C

Use 'bq show' to list all jobs. Per job, use 'bq ls' to list job information and get the required information.

  • D

    Use Cloud Audit Logging to view Cloud Audit Logs, and create a filter on the query operation to get the required information.

Answer: D

Explanation:

D, for two reasons:
1) Cloud Audit Logs maintains audit logs for admin activity, data access, and system events, and BigQuery audit entries are sent to Cloud Audit Logs automatically.
2) A filter can select the relevant BigQuery audit messages, and such filters can also be expressed as part of a log export.
https://cloud.google.com/logging/docs/audit
https://cloud.google.com/bigquery/docs/reference/auditlogs#ids
https://cloud.google.com/bigquery/docs/reference/auditlogs#auditdata_examples
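As a sketch, a filter along the lines of the AuditData examples linked above could be read back with gcloud and tallied per user; the exact filter fields depend on which audit-log format your project emits:

```shell
# Read completed query jobs from the last 30 days out of Cloud Audit Logs
# and count them per principal. Filter fields follow the older BigQuery
# AuditData format referenced above.
gcloud logging read '
  resource.type="bigquery_resource"
  protoPayload.methodName="jobservice.jobcompleted"
  protoPayload.serviceData.jobCompletedEvent.eventName="query_job_completed"
' --freshness=30d \
  --format='value(protoPayload.authenticationInfo.principalEmail)' \
| sort | uniq -c | sort -rn
```

Each output line is a count followed by a user email, most active users first.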


[Single choice] Question #148
You are designing a data warehouse on Google Cloud and want to store sensitive data in BigQuery. Your company requires you to generate the encryption keys outside of Google Cloud. You need to implement a solution. What should you do?

  • A

    Generate a new key in Cloud Key Management Service (Cloud KMS). Store all data in Cloud Storage using the customer-managed key option and select the created key. Set up a Dataflow pipeline to decrypt the data and to store it in a new BigQuery dataset.

  • B

    Generate a new key in Cloud KMS. Create a dataset in BigQuery using the customer-managed key option and select the created key.

  • C

    Import a key in Cloud KMS. Store all data in Cloud Storage using the customer-managed key option and select the created key. Set up a Dataflow pipeline to decrypt the data and to store it in a new BigQuery dataset.

  • D

    Import a key in Cloud KMS. Create a dataset in BigQuery using the customer-supplied key option and select the created key.

Answer: D
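Importing externally generated key material into Cloud KMS and pointing a BigQuery dataset at it could look roughly like this. All names, the location, and the import method are assumptions; in BigQuery the dataset references the KMS key through its customer-managed encryption key setting:

```shell
# Create an empty key to receive the imported material.
gcloud kms keyrings create bq-keyring --location=us
gcloud kms keys create bq-key --location=us --keyring=bq-keyring \
    --purpose=encryption --skip-initial-version-creation

# An import job wraps the externally generated key material in transit.
gcloud kms import-jobs create bq-import-job \
    --location=us --keyring=bq-keyring \
    --import-method=rsa-oaep-3072-sha256-aes-256 \
    --protection-level=software

gcloud kms keys versions import \
    --import-job=bq-import-job --location=us \
    --keyring=bq-keyring --key=bq-key \
    --algorithm=google-symmetric-encryption \
    --target-key-file=./external-key.bin

# Create a dataset whose tables default to this key.
bq mk --dataset \
    --default_kms_key=projects/my-project/locations/us/keyRings/bq-keyring/cryptoKeys/bq-key \
    mydataset
```

The BigQuery service account must also be granted the Encrypter/Decrypter role on the key before the dataset can use it.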

[Single choice] You are managing several projects on Google Cloud and need to interact on a daily basis with BigQuery, Bigtable, and Kubernetes Engine using the gcloud CLI tool. You travel a lot and work on different workstations during the week. You want to avoid having to manage the gcloud CLI manually. What should you do?

  • A

Install gcloud on all of your workstations. Run the command gcloud components auto-update on each workstation.

  • B

Create a Compute Engine instance and install gcloud on the instance. Connect to this instance via SSH to always use the same gcloud installation when interacting with Google Cloud.

  • C

    Use Google Cloud Shell in the Google Cloud Console to interact with Google Cloud.

  • D

Use a package manager to install gcloud on your workstations instead of installing it manually.

Answer: C

[Multiple choice] Your company implemented BigQuery as an enterprise data warehouse. Users from multiple business units run queries on this data warehouse. However, you notice that query costs for BigQuery are very high and you need to control costs. Which two methods should you use? (Choose two.)

  • A

    Split the users from business units to multiple projects.

  • B

Apply a user- or project-level custom query quota for the BigQuery data warehouse.

  • C

    Create separate copies of your BigQuery data warehouse for each business unit.

  • D

    Split your BigQuery data warehouse into multiple data warehouses for each business unit.

  • E

Change your BigQuery query model from on-demand to flat rate. Apply the appropriate number of slots to each project.

Answer: B, E
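Custom query quotas (option B) are set per project or per user on the Quotas page of the console. The flat-rate side of option E goes through BigQuery Reservations, which could be sketched with bq as follows; the administration project, location, slot counts, and reservation names are assumptions:

```shell
# Purchase a slot commitment in an administration project.
bq mk --project_id=admin-project --location=US \
    --capacity_commitment --plan=MONTHLY --slots=500

# Carve a reservation out of the committed slots for one business unit.
bq mk --project_id=admin-project --location=US \
    --reservation --slots=100 bu1_reservation

# Assign that business unit's project to the reservation for query jobs.
bq mk --project_id=admin-project --location=US \
    --reservation_assignment \
    --reservation_id=admin-project:US.bu1_reservation \
    --assignee_type=PROJECT --assignee_id=bu1-project \
    --job_type=QUERY
```

Queries from the assigned project then draw on the fixed slot pool instead of incurring on-demand, per-byte charges.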

[Single choice] Your organization needs to grant users access to query datasets in BigQuery but prevent them from accidentally deleting the datasets. You want a solution that follows Google-recommended practices. What should you do?

  • A

Add users to the roles/bigquery.user role only, instead of roles/bigquery.dataOwner.

  • B

Add users to the roles/bigquery.dataEditor role only, instead of roles/bigquery.dataOwner.

  • C

    Create a custom role by removing delete permissions, and add users to that role only.

  • D

    Create a custom role by removing delete permissions. Add users to the group, and then add the group to the custom role.

Answer: A

[Single choice] If you have configured Stackdriver Logging to export logs to BigQuery, but log entries are not getting exported to BigQuery, what is the most likely cause?

  • A

    The Cloud Data Transfer Service has not been enabled.

  • B

    There isn't a firewall rule allowing traffic between Stackdriver and BigQuery.

  • C

Stackdriver Logging does not have permission to write to the BigQuery dataset.

  • D

    The size of the Stackdriver log entries being exported exceeds the maximum capacity of the BigQuery dataset.

Answer: C
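The fix for C is to grant the sink's writer identity write access on the dataset's project. A minimal sketch, with an assumed sink name and project ID:

```shell
# Look up the service account that the sink writes with.
WRITER=$(gcloud logging sinks describe my-bq-sink \
    --format='value(writerIdentity)')

# Grant it permission to create and write tables in BigQuery.
gcloud projects add-iam-policy-binding my-project \
    --member="${WRITER}" \
    --role=roles/bigquery.dataEditor
```

Without this binding the sink silently drops entries, which matches the symptom in the question.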

[Single choice] Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?

  • A

    1. In the BigQuery dataset that contains all the tables to be queried, add a label for each user that can launch a query. 2. Open the Billing page of the project. 3. Select Reports. 4. Select BigQuery as the product and filter by the user you want to check.

  • B

    1. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. 2. Perform a BigQuery query on the generated table to extract the information you need.

  • C

    1. Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. 2. Develop a Dataflow pipeline to compute the cost of queries split by users.

  • D

    1. Activate billing export into BigQuery. 2. Perform a BigQuery query on the billing table to extract the information you need.

Answer: B
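The first step of option B could look like this; the sink name, project, and dataset are assumptions:

```shell
# Route BigQuery data-access audit logs into a BigQuery dataset.
gcloud logging sinks create bq-audit-sink \
    bigquery.googleapis.com/projects/my-project/datasets/bq_audit \
    --log-filter='resource.type="bigquery_resource"
                  protoPayload.methodName="jobservice.jobcompleted"'

# Grant the sink's writer identity roles/bigquery.dataEditor on the
# destination dataset, then query the exported table -- e.g. summing
# bytes billed grouped by principal email -- to rank queries and users
# by cost in near real time.
```

New audit entries stream into the dataset as jobs complete, so the cost query can be rerun at any time for an up-to-date picture.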

[Single choice] Your company captures all web traffic data in Google Analytics 360 and stores it in BigQuery. Each country has its own dataset. Each dataset has multiple tables. You want analysts from each country to be able to see and query only the data for their respective countries. How should you configure the access rights?

  • A

Create a group per country. Add analysts to their respective country-groups. Create a single group 'all_analysts', and add all country-groups as members. Grant the 'all_analysts' group the IAM role of BigQuery jobUser. Share the appropriate dataset with view access with each respective analyst country-group.

  • B

Create a group per country. Add analysts to their respective country-groups. Create a single group 'all_analysts' and add all country-groups as members. Grant the 'all_analysts' group the IAM role of BigQuery jobUser. Share the appropriate tables with view access with each respective analyst country-group.

  • C

Create a group per country. Add analysts to their respective country-groups. Create a single group 'all_analysts' and add all country-groups as members. Grant the 'all_analysts' group the IAM role of BigQuery dataViewer. Share the appropriate dataset with view access with each respective analyst country-group.

  • D

Create a group per country. Add analysts to their respective country-groups. Create a single group 'all_analysts' and add all country-groups as members. Grant the 'all_analysts' group the IAM role of BigQuery dataViewer. Share the appropriate table with view access with each respective analyst country-group.

Answer: A

Explanation:

It should be A. The question requires that users from each country can only view a specific dataset, so BigQuery dataViewer cannot be assigned at the project level. Only A limits users to querying and viewing exactly the data they are supposed to see.
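Option A's two halves can be sketched as follows, with assumed project, group, and dataset names. The documented way to share a dataset with a group from the CLI is to edit the dataset's access list:

```shell
# Project-wide: the umbrella group may run query jobs.
gcloud projects add-iam-policy-binding my-project \
    --member=group:all_analysts@example.com \
    --role=roles/bigquery.jobUser

# Dataset-level: share one country's dataset with its country group by
# exporting the dataset metadata, appending a READER entry, and writing
# it back.
bq show --format=prettyjson my-project:dataset_fr > dataset_fr.json
# (edit dataset_fr.json: add
#   {"role": "READER", "groupByEmail": "analysts-fr@example.com"}
#  to the "access" array)
bq update --source dataset_fr.json my-project:dataset_fr
```

Repeating the dataset step per country gives each group read access only to its own data, while jobUser lets everyone run queries.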


[Single choice] Topic 1, Question #122
You are working at a sports association whose members range in age from 8 to 30. The association collects a large amount of health data, such as sustained injuries. You are storing this data in BigQuery. Current legislation requires you to delete such information upon request of the subject. You want to design a solution that can accommodate such a request. What should you do?

  • A

    Use a unique identifier for each individual. Upon a deletion request, delete all rows from BigQuery with this identifier.

  • B

    When ingesting new data in BigQuery, run the data through the Data Loss Prevention (DLP) API to identify any personal information. As part of the DLP scan, save the result to Data Catalog. Upon a deletion request, query Data Catalog to find the column with personal information.

  • C

    Create a BigQuery view over the table that contains all data. Upon a deletion request, exclude the rows that affect the subject's data from this view. Use this view instead of the source table for all analysis tasks.

  • D

    Use a unique identifier for each individual. Upon a deletion request, overwrite the column with the unique identifier with a salted SHA256 of its value.

Answer: B

Explanation:

The question states that "the association collects a large amount of health data, such as sustained injuries", and the nuance is in the word "such": "Current legislation requires you to delete SUCH information upon request of the subject." From that point of view, the task is not to delete entire user records but the specific personal health data. With DLP you can use infoTypes and infoType detectors to scan specifically for those entries and decide how to act on them (https://cloud.google.com/dlp/docs/concepts-infotypes).
I would say B.


[Single choice] Question #156
Your company has a Google Cloud project that uses BigQuery for data warehousing. They have a VPN tunnel between the on-premises environment and Google Cloud that is configured with Cloud VPN. The security team wants to avoid data exfiltration by malicious insiders, compromised code, and accidental oversharing.
What should they do?

  • A

    Configure Private Google Access for on-premises only.

  • B

    Perform the following tasks: 1. Create a service account. 2. Give the BigQuery JobUser role and Storage Reader role to the service account. 3. Remove all other IAM access from the project.

  • C

    Configure VPC Service Controls and configure Private Google Access.

  • D

    Configure Private Google Access.

Answer: C

Explanation:

In my opinion it's C: VPC Service Controls.
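The VPC Service Controls half of option C could be sketched like this; the perimeter name, project number, and access policy ID are placeholders:

```shell
# Put the project inside a service perimeter that restricts BigQuery.
gcloud access-context-manager perimeters create bq_perimeter \
    --title="BigQuery perimeter" \
    --resources=projects/123456789 \
    --restricted-services=bigquery.googleapis.com \
    --policy=POLICY_ID

# With Private Google Access configured for on-premises, hosts reach
# BigQuery over the VPN through Google's restricted endpoints, and the
# perimeter blocks copying data to projects outside it.
```

The perimeter is what stops exfiltration; Private Google Access keeps the on-premises path private without needing external IPs.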


[Single choice] Topic 1, Question #179
Your company has a Google Cloud project that uses BigQuery for data warehousing. There are some tables that contain personally identifiable information (PII).
Only the compliance team may access the PII. The other information in the tables must be available to the data science team. You want to minimize cost and the time it takes to assign appropriate access to the tables. What should you do?

  • A

1. From the dataset where you have the source data, create views of the tables that you want to share, excluding PII. 2. Assign an appropriate project-level IAM role to the members of the data science team. 3. Assign access controls to the dataset that contains the view.

  • B

1. From the dataset where you have the source data, create materialized views of the tables that you want to share, excluding PII. 2. Assign an appropriate project-level IAM role to the members of the data science team. 3. Assign access controls to the dataset that contains the view.

  • C

1. Create a dataset for the data science team. 2. Create views of the tables that you want to share, excluding PII. 3. Assign an appropriate project-level IAM role to the members of the data science team. 4. Assign access controls to the dataset that contains the view. 5. Authorize the view to access the source dataset.

  • D

1. Create a dataset for the data science team. 2. Create materialized views of the tables that you want to share, excluding PII. 3. Assign an appropriate project-level IAM role to the members of the data science team. 4. Assign access controls to the dataset that contains the view. 5. Authorize the view to access the source dataset.

Answer: C

Explanation:

I would say C.
A does not limit access to the source dataset, which still contains PII.
C creates a new dataset without PII and authorizes its view against the source.
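Option C's authorized-view pattern could be sketched like this; the project, dataset, table, and column names are assumptions:

```shell
# Dataset for the data science team, holding only the PII-free view.
bq mk --dataset my-project:shared_views
bq mk --use_legacy_sql=false \
    --view='SELECT * EXCEPT (ssn, date_of_birth)
            FROM `my-project.source_ds.patients`' \
    shared_views.patients_no_pii

# Authorize the view to read the source dataset: export the source
# dataset's metadata, add the view to its "access" array, and write
# it back.
bq show --format=prettyjson my-project:source_ds > source_ds.json
# (add {"view": {"projectId": "my-project", "datasetId": "shared_views",
#        "tableId": "patients_no_pii"}} to "access")
bq update --source source_ds.json my-project:source_ds
```

The data science team is then granted access only on `shared_views`; the view reads the source on their behalf without exposing the PII columns.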

