Recommend!!
Get the Full Associate-Cloud-Engineer dumps in VCE and PDF From SurePassExam
https://www.surepassexam.com/Associate-Cloud-Engineer-exam-dumps.html (258 New Questions)
Google
Exam Questions Associate-Cloud-Engineer
Google Cloud Certified - Associate Cloud Engineer
Passing Certification Exams Made Easy visit - https://www.surepassexam.com
NEW QUESTION 1
You have a set of on-premises data analytics binaries that processes data files in memory for about 45 minutes every midnight. The sizes of those data files
range from 1 gigabyte to 16 gigabytes. You want to migrate this application to Google Cloud with minimal effort and cost. What should you do?
A. Upload the code to Cloud Functions. Use Cloud Scheduler to start the application.
B. Create a container for the set of binaries. Use Cloud Scheduler to start a Cloud Run job for the container.
C. Create a container for the set of binaries. Deploy the container to Google Kubernetes Engine (GKE) and use the Kubernetes scheduler to start the application.
D. Lift and shift to a VM on Compute Engine. Use an instance schedule to start and stop the instance.
Answer: B
NEW QUESTION 2
Your company has a large quantity of unstructured data in different file formats. You want to perform ETL transformations on the data. You need to make the data
accessible on Google Cloud so it can be processed by a Dataflow job. What should you do?
A. Upload the data to BigQuery using the bq command line tool.
B. Upload the data to Cloud Storage using the gsutil command line tool.
C. Upload the data into Cloud SQL using the import function in the console.
D. Upload the data into Cloud Spanner using the import function in the console.
Answer: B
Explanation:
"Large quantity" of unstructured data points to Cloud Storage or BigQuery; since the data is in "files", Cloud Storage is the fit — a file stored in Cloud Storage is simply an object, and Dataflow can read directly from a bucket.
NEW QUESTION 3
You have one project called proj-sa where you manage all your service accounts. You want to be able to use a service account from this project to take snapshots
of VMs running in another project called proj-vm. What should you do?
A. Download the private key from the service account, and add it to each VMs custom metadata.
B. Download the private key from the service account, and add the private key to each VM’s SSH keys.
C. Grant the service account the IAM Role of Compute Storage Admin in the project called proj-vm.
D. When creating the VMs, set the service account’s API scope for Compute Engine to read/write.
Answer: C
Explanation:
https://gtseres.medium.com/using-service-accounts-across-projects-in-gcp-cf9473fef8f0
You create the service account in proj-sa and take note of the service account email. Then, in proj-vm, go to IAM > ADD, add the service account's email as a
new member, and give it the Compute Storage Admin role.
https://cloud.google.com/compute/docs/access/iam#compute.storageAdmin
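Assuming a hypothetical service account name, the grant from option C could be sketched on the command line as:

```shell
# Grant the proj-sa service account the Compute Storage Admin role on proj-vm.
# The service account email (snapshot-sa@...) is illustrative; substitute your own.
gcloud projects add-iam-policy-binding proj-vm \
    --member="serviceAccount:snapshot-sa@proj-sa.iam.gserviceaccount.com" \
    --role="roles/compute.storageAdmin"
```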
NEW QUESTION 4
You are developing a new application and are looking for a Jenkins installation to build and deploy your source code. You want to automate the installation as
quickly and easily as possible. What should you do?
A. Deploy Jenkins through the Google Cloud Marketplace.
B. Create a new Compute Engine instance. Run the Jenkins executable.
C. Create a new Kubernetes Engine cluster. Create a deployment for the Jenkins image.
D. Create an instance template with the Jenkins executable. Create a managed instance group with this template.
Answer: A
Explanation:
Installing Jenkins
In this section, you use Cloud Marketplace to provision a Jenkins instance. You customize this instance to use the agent image you created in the previous
section.
Go to the Cloud Marketplace solution for Jenkins. Click Launch on Compute Engine.
Change the Machine Type field to 4 vCPUs 15 GB Memory, n1-standard-4.
Machine type selection for Jenkins deployment.
Click Deploy and wait for your Jenkins instance to finish being provisioned. When it is finished, you will see: Jenkins has been deployed.
https://cloud.google.com/solutions/using-jenkins-for-distributed-builds-on-compute-engine#installing_jenkins
NEW QUESTION 5
Your organization has a dedicated person who creates and manages all service accounts for Google Cloud projects. You need to assign this person the minimum
role for projects. What should you do?
A. Add the user to roles/iam.roleAdmin role.
B. Add the user to roles/iam.securityAdmin role.
C. Add the user to roles/iam.serviceAccountUser role.
D. Add the user to roles/iam.serviceAccountAdmin role.
Answer: D
NEW QUESTION 6
Your company is moving its continuous integration and delivery (CI/CD) pipeline to Compute Engine instances. The pipeline will manage the entire cloud
infrastructure through code. How can you ensure that the pipeline has appropriate permissions while your system is following security best practices?
A. Add a step for human approval to the CI/CD pipeline before the execution of the infrastructure provisioning. Use the human approver's IAM account for the provisioning.
B. Attach a single service account to the compute instances. Add minimal rights to the service account. Allow the service account to impersonate a Cloud Identity user with elevated permissions to create, update, or delete resources.
C. Attach a single service account to the compute instances. Add all required Identity and Access Management (IAM) permissions to this service account to create, update, or delete resources.
D. Create multiple service accounts, one for each pipeline, with the appropriate minimal Identity and Access Management (IAM) permissions. Use a secret manager service to store the key files of the service accounts. Allow the CI/CD pipeline to request the appropriate secrets during the execution of the pipeline.
Answer: B
Explanation:
The best option is to attach a single service account to the compute instances and add minimal rights to the service account. Then, allow the service account to
impersonate a Cloud Identity user with elevated
permissions to create, update, or delete resources. This way, the service account can use short-lived access tokens to authenticate to Google Cloud APIs without
needing to manage service account keys. This option follows the principle of least privilege and reduces the risk of credential leakage and misuse.
Option A is not recommended because it requires human intervention, which can slow down the CI/CD pipeline and introduce human errors. Option C is not
secure because it grants all required IAM permissions to a single service account, which can increase the impact of a compromised key. Option D is not
cost-effective because it requires creating and managing multiple service accounts and keys, as well as using a secret manager service.
References:
1: https://cloud.google.com/iam/docs/impersonating-service-accounts
2: https://cloud.google.com/iam/docs/best-practices-for-managing-service-account-keys
3: https://cloud.google.com/iam/docs/understanding-service-accounts
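As a sketch of the impersonation pattern (all account and project names are hypothetical, and this shows the common service-account-to-service-account variant rather than impersonating a Cloud Identity user directly):

```shell
# Let the pipeline's low-privilege service account mint short-lived tokens
# for a privileged deployer account, instead of holding elevated rights itself.
gcloud iam service-accounts add-iam-policy-binding \
    deployer@my-project.iam.gserviceaccount.com \
    --member="serviceAccount:pipeline@my-project.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountTokenCreator"

# A pipeline step can then act as the privileged account without any key file:
gcloud compute instances list \
    --impersonate-service-account=deployer@my-project.iam.gserviceaccount.com
```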
NEW QUESTION 7
You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data. You want access to the
content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You
want to use the most secure method that requires the fewest steps. What should you do?
A. Create a signed URL with a four-hour expiration and share the URL with the company.
B. Set object access to 'public' and use object lifecycle management to remove the object after four hours.
C. Configure the storage bucket as a static website and furnish the object's URL to the company. Delete the object from the storage bucket after four hours.
D. Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.
Answer: A
Explanation:
Signed URLs are used to give time-limited resource access to anyone in possession of the URL, regardless of whether they have a Google account.
https://cloud.google.com/storage/docs/access-control/signed-urls
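Assuming a hypothetical bucket, object, and key file, generating the four-hour signed URL could be sketched as:

```shell
# Create a signed URL valid for 4 hours; requires a service account key file.
gsutil signurl -d 4h /path/to/sa-key.json gs://my-bucket/sensitive-report.csv
```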
NEW QUESTION 8
You are running multiple VPC-native Google Kubernetes Engine clusters in the same subnet. The IPs available for the nodes are exhausted, and you want to
ensure that the clusters can grow in nodes when needed. What should you do?
A. Create a new subnet in the same region as the subnet being used.
B. Add an alias IP range to the subnet used by the GKE clusters.
C. Create a new VPC, and set up VPC peering with the existing VPC.
D. Expand the CIDR range of the relevant subnet for the cluster.
Answer: D
Explanation:
gcloud compute networks subnets expand-ip-range NAME gcloud compute networks subnets expand-ip-range
- expand the IP range of a Compute Engine subnetwork https://cloud.google.com/sdk/gcloud/reference/compute/networks/subnets/expand-ip-range
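With an illustrative subnet name, region, and prefix length, the expansion could be sketched as:

```shell
# Widen the subnet's primary range (the prefix length can only decrease,
# e.g. /24 -> /20); subnet name and region are hypothetical.
gcloud compute networks subnets expand-ip-range gke-subnet \
    --region=us-central1 \
    --prefix-length=20
```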
NEW QUESTION 9
Your auditor wants to view your organization's use of data in Google Cloud. The auditor is most interested in auditing who accessed data in Cloud Storage
buckets. You need to help the auditor access the data they need. What should you do?
A. Assign the appropriate permissions, and then use Cloud Monitoring to review metrics
B. Use the export logs API to provide the Admin Activity Audit Logs in the format they want
C. Turn on Data Access Logs for the buckets they want to audit, and then build a query in the log viewer that filters on Cloud Storage
D. Assign the appropriate permissions, and then create a Data Studio report on Admin Activity Audit Logs
Answer: C
Explanation:
Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization: Admin Activity audit logs, Data Access audit logs, System Event audit logs, and Policy Denied audit logs. Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.
https://cloud.google.com/logging/docs/audit#types https://cloud.google.com/logging/docs/audit#data-access
Cloud Storage: when Cloud Storage usage logs are enabled, Cloud Storage writes usage data to the Cloud Storage bucket, which generates Data Access audit logs for the bucket. The generated Data Access audit log has its caller identity redacted.
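Once Data Access logs are enabled, an equivalent command-line filter for Cloud Storage access could be sketched as follows (the exact filter and limit are illustrative):

```shell
# Read recent Data Access audit log entries for Cloud Storage buckets.
gcloud logging read \
    'logName:"cloudaudit.googleapis.com%2Fdata_access" AND resource.type="gcs_bucket"' \
    --limit=20
```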
NEW QUESTION 10
You have a single binary application that you want to run on Google Cloud Platform. You decided to automatically scale the application based on underlying
infrastructure CPU usage. Your organizational policies require you to use virtual machines directly. You need to ensure that the application scaling is operationally
efficient and completed as quickly as possible. What should you do?
A. Create a Google Kubernetes Engine cluster, and use horizontal pod autoscaling to scale the application.
B. Create an instance template, and use the template in a managed instance group with autoscaling configured.
C. Create an instance template, and use the template in a managed instance group that scales up and down based on the time of day.
D. Use a set of third-party tools to build automation around scaling the application up and down, based on Stackdriver CPU usage monitoring.
Answer: B
Explanation:
Managed instance groups offer autoscaling capabilities that let you automatically add or delete instances from a managed instance group based on increases or
decreases in load (CPU Utilization in this case). Autoscaling helps your apps gracefully handle increases in traffic and reduce costs when the need for resources is
lower. You define the autoscaling policy and the autoscaler performs automatic scaling based on the measured load (CPU Utilization in this case). Autoscaling
works by adding more instances to your instance group when there is more load (upscaling), and deleting instances when the need for instances is lowered
(downscaling). Ref: https://cloud.google.com/compute/docs/autoscaler
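A minimal sketch of the template-plus-MIG setup, with all names and thresholds illustrative:

```shell
# Create an instance template for the application's VMs.
gcloud compute instance-templates create app-template \
    --machine-type=e2-medium \
    --image-family=debian-12 --image-project=debian-cloud

# Create a managed instance group from the template.
gcloud compute instance-groups managed create app-mig \
    --template=app-template --size=2 --zone=us-central1-a

# Autoscale the group on CPU utilization.
gcloud compute instance-groups managed set-autoscaling app-mig \
    --zone=us-central1-a \
    --max-num-replicas=10 \
    --target-cpu-utilization=0.6 \
    --cool-down-period=90
```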
NEW QUESTION 11
You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute
Engine instances is running in its own VPC. What should you do?
A. Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.
B. Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.
C. Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.
D. Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.
Answer: B
Explanation:
Shared VPC allows an organization to connect resources from multiple projects to a common Virtual Private Cloud (VPC) network, so that they can communicate
with each other securely and efficiently using internal IPs from that network. When you use Shared VPC, you designate a project as a host project and attach one
or more other service projects to it. The VPC networks in the host project are called Shared VPC networks. Eligible resources from service projects can use
subnets in the Shared VPC network
https://cloud.google.com/vpc/docs/shared-vpc
"For example, an existing instance in a service project cannot be reconfigured to use a Shared VPC network, but a new instance can be created to use available
subnets in a Shared VPC network."
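The Shared VPC setup can be sketched with hypothetical project IDs (this requires the Shared VPC Admin role on the organization):

```shell
# Designate the host project, then attach the service project.
gcloud compute shared-vpc enable host-project-id
gcloud compute shared-vpc associated-projects add service-project-id \
    --host-project=host-project-id
```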
NEW QUESTION 12
You are the organization and billing administrator for your company. The engineering team has the Project Creator role on the organization. You do not want the
engineering team to be able to link projects to the billing account. Only the finance team should be able to link a project to a billing account, but they should not be
able to make any other changes to projects. What should you do?
A. Assign the finance team only the Billing Account User role on the billing account.
B. Assign the engineering team only the Billing Account User role on the billing account.
C. Assign the finance team the Billing Account User role on the billing account and the Project Billing Manager role on the organization.
D. Assign the engineering team the Billing Account User role on the billing account and the Project Billing Manager role on the organization.
Answer: C
Explanation:
From this source:
https://cloud.google.com/billing/docs/how-to/custom-roles#permission_association_and_inheritance
"For example, associating a project with a billing account requires the billing.resourceAssociations.create permission on the billing account and also the
resourcemanager.projects.createBillingAssignment permission on the project. This is because project permissions are required for actions where project owners
control access, while billing account permissions are required for actions where billing account administrators control access. When both should be involved, both
permissions are necessary."
NEW QUESTION 13
You need to provide a cost estimate for a Kubernetes cluster using the GCP pricing calculator for Kubernetes. Your workload requires high IOPS, and you will also
be using disk snapshots. You start by entering the number of nodes, average hours, and average days. What should you do next?
A. Fill in local SSD. Fill in persistent disk storage and snapshot storage.
B. Fill in local SSD. Add estimated cost for cluster management.
C. Select Add GPUs. Fill in persistent disk storage and snapshot storage.
D. Select Add GPUs. Add estimated cost for cluster management.
Answer: A
Explanation:
https://cloud.google.com/compute/docs/disks/local-ssd
NEW QUESTION 14
You are analyzing Google Cloud Platform service costs from three separate projects. You want to use this information to create service cost estimates by service
type, daily and monthly, for the next six months using standard query syntax. What should you do?
A. Export your bill to a Cloud Storage bucket, and then import into Cloud Bigtable for analysis.
B. Export your bill to a Cloud Storage bucket, and then import into Google Sheets for analysis.
C. Export your transactions to a local file, and perform analysis with a desktop tool.
D. Export your bill to a BigQuery dataset, and then write time window-based SQL queries for analysis.
Answer: D
Explanation:
"...we recommend that you enable Cloud Billing data export to BigQuery at the same time that you create a Cloud Billing account. "
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
https://medium.com/google-cloud/analyzing-google-cloud-billing-data-with-big-query-30bae1c2aae4
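After the export is enabled, a time-window query by service could be sketched as follows; the dataset and table names follow the export's naming pattern but are illustrative:

```shell
bq query --use_legacy_sql=false '
SELECT
  service.description AS service,
  DATE(usage_start_time) AS day,
  SUM(cost) AS total_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
GROUP BY service, day
ORDER BY day, service'
```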
NEW QUESTION 15
You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing
GCP project. What should you do?
A. 1. Verify that you are assigned the Project Owner IAM role for this project. 2. Locate the project in the GCP console, click Shut down, and then enter the project ID.
B. 1. Verify that you are assigned the Project Owner IAM role for this project. 2. Switch to the project in the GCP console, locate the resources, and delete them.
C. 1. Verify that you are assigned the Organization Administrator IAM role for this project. 2. Locate the project in the GCP console, enter the project ID, and then click Shut down.
D. 1. Verify that you are assigned the Organization Administrator IAM role for this project. 2. Switch to the project in the GCP console, locate the resources, and delete them.
Answer: A
Explanation:
https://cloud.google.com/run/docs/tutorials/gcloud https://cloud.google.com/resource-manager/docs/creating-managing-projects
https://cloud.google.com/iam/docs/understanding-roles#primitive_roles
You can shut down projects using the Cloud Console. When you shut down a project, this immediately happens: All billing and traffic serving stops, You lose
access to the project, The owners of the project will be notified and can stop the deletion within 30 days, The project will be scheduled to be deleted after 30 days.
However, some resources may be deleted much earlier.
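The same shutdown can be performed from the CLI (the project ID is hypothetical); it likewise schedules deletion after the 30-day pending period:

```shell
gcloud projects delete my-division-project
```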
NEW QUESTION 16
You need to grant access for three users so that they can view and edit table data on a Cloud Spanner instance. What should you do?
A. Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to the role.
B. Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to a new group. Add the group to the role.
C. Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to the role.
D. Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to a new group. Add the group to the role.
Answer: B
Explanation:
https://cloud.google.com/spanner/docs/iam#spanner.databaseUser
Using the gcloud tool, execute the gcloud iam roles describe roles/spanner.databaseUser command on Cloud Shell. Attach the users to a newly created Google
group and add the group to the role.
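Granting the role to the group could be sketched as below, with a hypothetical group and project:

```shell
# Bind roles/spanner.databaseUser to a Google group at the project level.
gcloud projects add-iam-policy-binding my-project \
    --member="group:spanner-editors@example.com" \
    --role="roles/spanner.databaseUser"
```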
NEW QUESTION 17
Your team maintains the infrastructure for your organization. The current infrastructure requires changes. You need to share your proposed changes with the rest
of the team. You want to follow Google’s recommended best practices. What should you do?
A. Use Deployment Manager templates to describe the proposed changes and store them in a Cloud Storage bucket.
B. Use Deployment Manager templates to describe the proposed changes and store them in Cloud Source Repositories.
C. Apply the change in a development environment, run gcloud compute instances list, and then save the output in a shared Storage bucket.
D. Apply the change in a development environment, run gcloud compute instances list, and then save the output in Cloud Source Repositories.
Answer: B
Explanation:
Showing Deployment Manager templates to your team will allow you to define the changes you want to implement in your cloud infrastructure. You can use Cloud
Source Repositories to store Deployment Manager templates and collaborate with your team. Cloud Source Repositories are fully-featured, scalable, and private
Git repositories you can use to store, manage and track changes to your code.
https://cloud.google.com/source-repositories/docs/features
NEW QUESTION 18
You have a large 5-TB AVRO file stored in a Cloud Storage bucket. Your analysts are proficient only in SQL and need access to the data stored in this file. You
want to find a cost-effective way to complete their request as soon as possible. What should you do?
A. Load data in Cloud Datastore and run a SQL query against it.
B. Create a BigQuery table and load data in BigQuery. Run a SQL query on this table and drop the table after you complete your request.
C. Create external tables in BigQuery that point to Cloud Storage buckets and run a SQL query on these external tables to complete your request.
D. Create a Hadoop cluster and copy the Avro file to HDFS by compressing it. Load the file in a Hive table and provide access to your analysts so that they can run SQL queries.
Answer: C
Explanation:
https://cloud.google.com/bigquery/external-data-sources
An external data source is a data source that you can query directly from BigQuery, even though the data is not stored in BigQuery storage.
BigQuery supports the following external data sources: Amazon S3, Azure Storage, Cloud Bigtable, Cloud Spanner, Cloud SQL, Cloud Storage, and Google Drive.
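A sketch of the external-table approach, with hypothetical dataset, table, and bucket names:

```shell
# Define a permanent external table over the Avro file in Cloud Storage...
bq mk --external_table_definition=AVRO=gs://my-bucket/large-file.avro \
    analytics.avro_external

# ...then analysts query it with standard SQL; no load job is required.
bq query --use_legacy_sql=false \
    'SELECT COUNT(*) FROM `analytics.avro_external`'
```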
NEW QUESTION 19
You deployed a new application inside your Google Kubernetes Engine cluster using the YAML file specified below.
You check the status of the deployed pods and notice that one of them is still in PENDING status:
You want to find out why the pod is stuck in pending status. What should you do?
A. Review details of the myapp-service Service object and check for error messages.
B. Review details of the myapp-deployment Deployment object and check for error messages.
C. Review details of myapp-deployment-58ddbbb995-lp86m Pod and check for warning messages.
D. View logs of the container in myapp-deployment-58ddbbb995-lp86m pod and check for warning messages.
Answer: C
Explanation:
https://kubernetes.io/docs/tasks/debug-application-cluster/debug-application/#debugging-pods
NEW QUESTION 20
You created a Google Cloud Platform project with an App Engine application inside the project. You initially configured the application to be served from the us-central region. Now you want the application to be served from the asia-northeast1 region. What should you do?
A. Change the default region property setting in the existing GCP project to asia-northeast1.
B. Change the region property setting in the existing App Engine application from us-central to asia-northeast1.
C. Create a second App Engine application in the existing GCP project and specify asia-northeast1 as the region to serve your application.
D. Create a new GCP project and create an App Engine application inside this new project. Specify asia-northeast1 as the region to serve your application.
Answer: D
Explanation:
https://cloud.google.com/appengine/docs/flexible/managing-projects-apps-billing
Two App Engine applications cannot run in the same project; you can check this diagram for more info:
https://cloud.google.com/appengine/docs/standard/an-overview-of-app-engine#components_of_an_application
And you cannot change the location after setting it for your App Engine app: https://cloud.google.com/appengine/docs/standard/locations
App Engine is regional, and you cannot change an app's region after you set it. Therefore, the only way to have the app run in another region is to create a new
project and target App Engine to run in the required region (asia-northeast1 in our case).
Ref: https://cloud.google.com/appengine/docs/locations
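In the new project, the region is fixed once at app creation time (the project ID here is hypothetical):

```shell
gcloud app create --project=new-project --region=asia-northeast1
```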
NEW QUESTION 21
You have deployed an application on a single Compute Engine instance. The application writes logs to disk. Users start reporting errors with the application. You
want to diagnose the problem. What should you do?
A. Navigate to Cloud Logging and view the application logs.
B. Connect to the instance’s serial console and read the application logs.
C. Configure a Health Check on the instance and set a Low Healthy Threshold value.
D. Install and configure the Cloud Logging Agent and view the logs from Cloud Logging.
Answer: D
NEW QUESTION 22
You are working for a hospital that stores its medical images in an on-premises data room. The hospital wants to use Cloud Storage for archival storage of these
images. The hospital wants an automated process to upload any new medical images to Cloud Storage. You need to design and implement a solution. What
should you do?
A. Deploy a Dataflow job from the batch template "Datastore to Cloud Storage". Schedule the batch job on the desired interval.
B. In the Cloud Console, go to Cloud Storage. Upload the relevant images to the appropriate bucket.
C. Create a script that uses the gsutil command line interface to synchronize the on-premises storage with Cloud Storage. Schedule the script as a cron job.
D. Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub topic. Create an application that sends all medical images to the Pub/Sub topic.
Answer: C
Explanation:
The hospital requires Cloud Storage for archival storage and wants to automate the process of uploading new medical images, so we use gsutil to copy on-premises images to Cloud Storage and automate the process via a cron job. Pub/Sub, by contrast, listens for changes in a Cloud Storage bucket and triggers the Pub/Sub topic, which is not what is required here.
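A sketch of the synchronization script and its cron schedule, with hypothetical paths and bucket name:

```shell
# sync-images.sh: mirror the on-premises image directory to Cloud Storage.
gsutil -m rsync -r /data/medical-images gs://hospital-image-archive

# Example crontab entry to run the script nightly at 02:00:
# 0 2 * * * /usr/local/bin/sync-images.sh
```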
NEW QUESTION 23
You need to manage multiple Google Cloud Platform (GCP) projects in the fewest steps possible. You want to configure the Google Cloud SDK command line
interface (CLI) so that you can easily manage multiple GCP projects. What should you do?
A. 1. Create a configuration for each project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.
B. 1. Create a configuration for each project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.
C. 1. Use the default configuration for one project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.
D. 1. Use the default configuration for one project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.
Answer: A
Explanation:
https://cloud.google.com/sdk/gcloud https://cloud.google.com/sdk/docs/configurations#multiple_configurations
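A sketch of per-project configurations (the project IDs are hypothetical):

```shell
# One named configuration per project...
gcloud config configurations create dev
gcloud config set project dev-project-id

gcloud config configurations create prod
gcloud config set project prod-project-id

# ...then switching projects is a single step:
gcloud config configurations activate dev
```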
NEW QUESTION 24
You need to produce a list of the enabled Google Cloud Platform APIs for a GCP project using the gcloud command line in the Cloud Shell. The project name is
my-project. What should you do?
A. Run gcloud projects list to get the project ID, and then run gcloud services list --project <project ID>.
B. Run gcloud init to set the current project to my-project, and then run gcloud services list --available.
C. Run gcloud info to view the account value, and then run gcloud services list --account <Account>.
D. Run gcloud projects describe <project ID> to verify the project value, and then run gcloud services list--available.
Answer: A
Explanation:
`gcloud services list --available` returns not only the enabled services in the project but also the services that CAN be enabled, so it is the wrong flag here.
https://cloud.google.com/sdk/gcloud/reference/services/list#--available
To list the enabled APIs and services in your current project, run: gcloud services list (equivalently, gcloud services list --enabled)
To list the APIs and services available to you in your current project, run: gcloud services list --available
--available
Return the services available to the project to enable. This list will include any services that the project has already enabled.
NEW QUESTION 25
Your organization needs to grant users access to query datasets in BigQuery but prevent them from accidentally deleting the datasets. You want a solution that
follows Google-recommended practices. What should you do?
A. Add users to the roles/bigquery.user role only, instead of roles/bigquery.dataOwner.
B. Add users to the roles/bigquery.dataEditor role only, instead of roles/bigquery.dataOwner.
C. Create a custom role by removing delete permissions, and add users to that role only.
D. Create a custom role by removing delete permissions. Add users to a group, and then add the group to the custom role.
Answer: D
Explanation:
https://cloud.google.com/bigquery/docs/access-control#custom_roles
Custom roles enable you to enforce the principle of least privilege, ensuring that the user and service accounts in your organization have only the permissions
essential to performing their intended functions.
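One way to sketch the custom role and group binding (the role ID, permission list, group, and project are all illustrative; in practice you would start from the dataEditor permission list and drop the delete permissions):

```shell
gcloud iam roles create bigQueryEditorNoDelete --project=my-project \
    --title="BigQuery Editor (no delete)" \
    --permissions=bigquery.datasets.get,bigquery.tables.get,bigquery.tables.getData,bigquery.tables.updateData,bigquery.jobs.create

gcloud projects add-iam-policy-binding my-project \
    --member="group:analysts@example.com" \
    --role="projects/my-project/roles/bigQueryEditorNoDelete"
```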
NEW QUESTION 26
Your company has a Google Cloud Platform project that uses BigQuery for data warehousing. Your data science team changes frequently and has few members.
You need to allow members of this team to perform queries. You want to follow Google-recommended practices. What should you do?
A. 1. Create an IAM entry for each data scientist's user account.2. Assign the BigQuery jobUser role to the group.
B. 1. Create an IAM entry for each data scientist's user account.2. Assign the BigQuery dataViewer user role to the group.
C. 1. Create a dedicated Google group in Cloud Identity.2. Add each data scientist's user account to the group.3. Assign the BigQuery jobUser role to the group.
D. 1. Create a dedicated Google group in Cloud Identity.2. Add each data scientist's user account to the group.3. Assign the BigQuery dataViewer user role to the
group.
Answer: C
Explanation:
BigQuery Data Viewer (roles/bigquery.dataViewer)
When applied to a table or view, this role provides permissions to read data and metadata from the table or view. This role cannot be applied to individual models or routines.
When applied to a dataset, this role provides permissions to read the dataset's metadata and list tables in the dataset, and to read data and metadata from the dataset's tables.
When applied at the project or organization level, this role can also enumerate all datasets in the project. Additional roles, however, are necessary to allow the running of jobs.
Lowest-level resources where you can grant this role: Table, View.
BigQuery Job User (roles/bigquery.jobUser)
Provides permissions to run jobs, including queries, within the project.
Lowest-level resource where you can grant this role: Project.
https://cloud.google.com/bigquery/docs/access-control#bigquery.jobUser
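The final grant could be sketched as below (the group and project names are hypothetical):

```shell
# Grant jobUser to the team's group rather than to individual accounts,
# so membership changes never require IAM edits.
gcloud projects add-iam-policy-binding my-project \
    --member="group:data-science@example.com" \
    --role="roles/bigquery.jobUser"
```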
NEW QUESTION 27
......