Associate Cloud Engineer 6

The document provides a series of questions and answers related to the Google Cloud Certified - Associate Cloud Engineer exam, including topics such as billing configuration, service account management, data transfer methods, and application deployment. It emphasizes best practices for managing resources and permissions within Google Cloud. Additionally, it includes links to resources for obtaining exam dumps and preparation materials.

Uploaded by

Louis Junior

Recommend!!

Get the Full Associate-Cloud-Engineer dumps in VCE and PDF From SurePassExam
https://www.surepassexam.com/Associate-Cloud-Engineer-exam-dumps.html (244 New Questions)

Google
Exam Questions Associate-Cloud-Engineer
Google Cloud Certified - Associate Cloud Engineer

Passing Certification Exams Made Easy visit - https://www.surepassexam.com



NEW QUESTION 1
Your company has multiple projects linked to a single billing account in Google Cloud. You need to visualize the costs with specific metrics that should be
dynamically calculated based on company-specific criteria. You want to automate the process. What should you do?

A. In the Google Cloud console, visualize the costs related to the projects in the Reports section.
B. In the Google Cloud console, visualize the costs related to the projects in the Cost breakdown section.
C. In the Google Cloud console, use the export functionality of the Cost table. Create a Looker Studio dashboard on top of the CSV export.
D. Configure Cloud Billing data export to BigQuery for the billing account. Create a Looker Studio dashboard on top of the BigQuery export.

Answer: D
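Once the Cloud Billing export to BigQuery is enabled, custom metrics can be computed with SQL and surfaced in Looker Studio. A minimal sketch of a per-project monthly cost query; the project, dataset, and export table names are placeholders for your own export configuration:

```shell
# Dataset and table names below are assumptions; substitute your billing export table.
bq query --use_legacy_sql=false '
SELECT
  project.id AS project_id,
  FORMAT_DATE("%Y-%m", DATE(usage_start_time)) AS month,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
GROUP BY project_id, month
ORDER BY month, total_cost DESC'
```

Pointing a Looker Studio data source at a view built from a query like this keeps the dashboard automatically up to date, since the billing export is refreshed by Google Cloud without manual steps.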

NEW QUESTION 2
Your organization has a dedicated person who creates and manages all service accounts for Google Cloud projects. You need to assign this person the minimum
role for projects. What should you do?

A. Add the user to roles/iam.roleAdmin role.


B. Add the user to roles/iam.securityAdmin role.
C. Add the user to roles/iam.serviceAccountUser role.
D. Add the user to roles/iam.serviceAccountAdmin role.

Answer: D

NEW QUESTION 3
You have been asked to set up the billing configuration for a new Google Cloud customer. Your customer wants to group resources that share common IAM
policies. What should you do?

A. Use labels to group resources that share common IAM policies


B. Use folders to group resources that share common IAM policies
C. Set up a proper billing account structure to group IAM policies
D. Set up a proper project naming structure to group IAM policies

Answer: B

Explanation:
Folders are nodes in the Cloud Platform Resource Hierarchy. A folder can contain projects, other folders, or a combination of both. Organizations can use folders
to group projects under the organization node in a hierarchy. For example, your organization might contain multiple departments, each with its own set of Google
Cloud resources. Folders allow you to group these resources on a per-department basis. Folders are used to group resources that share common IAM policies.
While a folder can contain multiple folders or resources, a given folder or resource can have exactly one parent.
https://cloud.google.com/resource-manager/docs/creating-managing-folders
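The folder hierarchy described above is managed with the Resource Manager CLI. A sketch, with hypothetical organization/folder IDs and project name:

```shell
# Create a folder under the organization (IDs are placeholders).
gcloud resource-manager folders create \
  --display-name="Engineering" \
  --organization=123456789012

# Move an existing project under the new folder, using the folder ID
# returned by the command above.
gcloud projects move my-sample-project --folder=456789012345
```

IAM policies granted on the folder are then inherited by every project and resource beneath it.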

NEW QUESTION 4
You have 32 GB of data in a single file that you need to upload to a Nearline Storage bucket. The WAN connection you are using is rated at 1 Gbps, and you are
the only one on the connection. You want to use as much of the rated 1 Gbps as possible to transfer the file rapidly. How should you upload the file?

A. Use the GCP Console to transfer the file instead of gsutil.


B. Enable parallel composite uploads using gsutil on the file transfer.
C. Decrease the TCP window size on the machine initiating the transfer.
D. Change the storage class of the bucket from Nearline to Multi-Regional.

Answer: B

Explanation:
https://cloud.google.com/storage/docs/parallel-composite-uploads https://cloud.google.com/storage/docs/uploads-downloads#parallel-composite-uploads
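A parallel composite upload splits a large object into chunks that are uploaded concurrently and then composed, which helps saturate a fast link. A sketch with hypothetical file and bucket names; the 150M threshold is just a commonly used value:

```shell
# Any file larger than the threshold is uploaded as parallel chunks.
gsutil -o "GSUtil:parallel_composite_upload_threshold=150M" \
  cp large-file.bin gs://my-nearline-bucket/
```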

NEW QUESTION 5
Your company's security vulnerability management policy wants a member of the security team to have visibility into vulnerabilities and other OS metadata for a specific Compute Engine instance. This Compute Engine instance hosts a critical application in your Google Cloud project. You need to implement your company's security vulnerability management policy. What should you do?

A. Ensure that the Ops Agent is installed on the Compute Engine instance. Create a custom metric in the Cloud Monitoring dashboard. Provide the security team member with access to this dashboard.
B. Ensure that the Ops Agent is installed on the Compute Engine instance. Provide the security team member the roles/osconfig.inventoryViewer role.
C. Ensure that the OS Config agent is installed on the Compute Engine instance. Provide the security team member the roles/osconfig.vulnerabilityReportViewer role.
D. Ensure that the OS Config agent is installed on the Compute Engine instance. Create a log sink to a BigQuery dataset. Provide the security team member with access to this dataset.

Answer: C

NEW QUESTION 6
You need to manage a third-party application that will run on a Compute Engine instance. Other Compute Engine instances are already running with default
configuration. Application installation files are hosted on Cloud Storage. You need to access these files from the new instance without allowing other virtual
machines (VMs) to access these files. What should you do?


A. Create the instance with the default Compute Engine service account Grant the service account permissions on Cloud Storage.
B. Create the instance with the default Compute Engine service account Add metadata to the objects on Cloud Storage that matches the metadata on the new
instance.
C. Create a new service account and assig n this service account to the new instance Grant the service account permissions on Cloud Storage.
D. Create a new service account and assign this service account to the new instance Add metadata to the objects on Cloud Storage that matches the metadata on
the new instance.

Answer: C

Explanation:
https://cloud.google.com/iam/docs/best-practices-for-using-and-managing-service-accounts
Object metadata does not control access to Cloud Storage objects, so option B cannot restrict access. To allow only the new instance to read the installation files, create a dedicated service account, assign it to that instance, and grant that service account the necessary permissions on the bucket. The other instances, which run as the default Compute Engine service account, do not inherit those permissions.
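A sketch of the dedicated-service-account approach; project, bucket, zone, and account names are hypothetical:

```shell
# Create a service account used only by the new instance.
gcloud iam service-accounts create app-installer \
  --display-name="Third-party app installer"

# Grant only this service account read access to the installer bucket.
gsutil iam ch \
  serviceAccount:app-installer@my-project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://my-installer-bucket

# Attach the dedicated service account to the new instance only.
gcloud compute instances create app-vm \
  --zone=us-central1-a \
  --service-account=app-installer@my-project.iam.gserviceaccount.com \
  --scopes=https://www.googleapis.com/auth/cloud-platform
```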

NEW QUESTION 7
You are building a data lake on Google Cloud for your Internet of Things (loT) application. The loT application has millions of sensors that are constantly streaming
structured and unstructured data to your backend in the cloud. You want to build a highly available and resilient architecture based on
Google-recommended practices. What should you do?

A. Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage.
B. Stream data to Pub/Sub, and use Storage Transfer Service to send data to BigQuery.
C. Stream data to Dataflow, and use Storage Transfer Service to send data to BigQuery.
D. Stream data to Dataflow, and use Dataprep by Trifacta to send data to Bigtable.

Answer: A

NEW QUESTION 8
You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data. You want access to the
content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You
want to use the most secure method that requires the fewest steps. What should you do?

A. Create a signed URL with a four-hour expiration and share the URL with the company.
B. Set object access to 'public' and use object lifecycle management to remove the object after four hours.
C. Configure the storage bucket as a static website and furnish the object's URL to the company. Delete the object from the storage bucket after four hours.
D. Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.

Answer: A

Explanation:
Signed URLs are used to give time-limited resource access to anyone in possession of the URL, regardless of whether they have a Google account.
https://cloud.google.com/storage/docs/access-control/signed-urls
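A signed URL with a four-hour lifetime can be generated in one command; the service account key file, bucket, and object names below are placeholders:

```shell
# Generate a URL valid for 4 hours using a service account key file.
gsutil signurl -d 4h sa-key.json gs://my-secure-bucket/report.csv
```

Anyone holding the printed URL can download the object until it expires, with no Google account required.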

NEW QUESTION 9
You have created an application that is packaged into a Docker image. You want to deploy the Docker image as a workload on Google Kubernetes Engine. What
should you do?

A. Upload the image to Cloud Storage and create a Kubernetes Service referencing the image.
B. Upload the image to Cloud Storage and create a Kubernetes Deployment referencing the image.
C. Upload the image to Container Registry and create a Kubernetes Service referencing the image.
D. Upload the image to Container Registry and create a Kubernetes Deployment referencing the image.

Answer: D

Explanation:
A deployment is responsible for keeping a set of pods running. A service is responsible for enabling network access to a set of pods.
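A sketch of the push-then-deploy flow; the project ID, image name, and ports are hypothetical:

```shell
# Tag and push the local image to Container Registry.
docker tag my-app gcr.io/my-project/my-app:v1
docker push gcr.io/my-project/my-app:v1

# Create a Deployment that keeps the containerized app running on GKE.
kubectl create deployment my-app --image=gcr.io/my-project/my-app:v1

# Optionally expose the Deployment through a Service for network access.
kubectl expose deployment my-app --type=LoadBalancer --port=80 --target-port=8080
```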

NEW QUESTION 10
You need to set up permissions for a set of Compute Engine instances to enable them to write data into a particular Cloud Storage bucket. You want to follow
Google-recommended practices. What should you do?

A. Create a service account with an access scope. Use the access scope 'https://www.googleapis.com/auth/devstorage.write_only'.
B. Create a service account with an access scope. Use the access scope 'https://www.googleapis.com/auth/cloud-platform'.
C. Create a service account and add it to the IAM role 'storage.objectCreator' for that bucket.
D. Create a service account and add it to the IAM role 'storage.objectAdmin' for that bucket.

Answer: C

Explanation:
https://cloud.google.com/iam/docs/understanding-service-accounts#using_service_accounts_with_compute_eng https://cloud.google.com/storage/docs/access-
control/iam-roles
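Granting the bucket-scoped objectCreator role can be sketched as follows; the service account and bucket names are hypothetical:

```shell
# Grant write-only (create) access on the bucket to the instances' service account.
gsutil iam ch \
  serviceAccount:writer-sa@my-project.iam.gserviceaccount.com:roles/storage.objectCreator \
  gs://my-data-bucket
```

Using a fine-grained IAM role on the bucket, rather than legacy access scopes, is the Google-recommended pattern.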


NEW QUESTION 10
You are deploying an application to App Engine. You want the number of instances to scale based on request rate. You need at least 3 unoccupied instances at all
times. Which scaling type should you use?

A. Manual Scaling with 3 instances.


B. Basic Scaling with min_instances set to 3.
C. Basic Scaling with max_instances set to 3.
D. Automatic Scaling with min_idle_instances set to 3.

Answer: D
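The min_idle_instances setting lives under automatic_scaling in app.yaml. A minimal sketch, assuming a Python standard-environment app (runtime and utilization target are assumptions):

```shell
# Write a minimal app.yaml with 3 idle instances kept warm at all times.
cat > app.yaml <<'EOF'
runtime: python39
automatic_scaling:
  min_idle_instances: 3
  target_cpu_utilization: 0.65
EOF
cat app.yaml
```

Idle instances are unoccupied capacity held ready for traffic spikes, which is exactly what the question asks for; Manual and Basic scaling have no such concept.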

NEW QUESTION 15
You need to manage a Cloud Spanner Instance for best query performance. Your instance in production runs in a single Google Cloud region. You need to
improve performance in the shortest amount of time. You want to follow Google best practices for service configuration. What should you do?

A. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. If you exceed this threshold, add nodes to your instance.
B. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.
C. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. If you exceed this threshold, add nodes to your instance.
D. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

Answer: C

Explanation:
https://cloud.google.com/spanner/docs/cpu-utilization#recommended-max

NEW QUESTION 17
You are building a pipeline to process time-series data. Which Google Cloud Platform services should you put in boxes 1,2,3, and 4?

A. Cloud Pub/Sub, Cloud Dataflow, Cloud Datastore, BigQuery


B. Firebase Messages, Cloud Pub/Sub, Cloud Spanner, BigQuery
C. Cloud Pub/Sub, Cloud Storage, BigQuery, Cloud Bigtable
D. Cloud Pub/Sub, Cloud Dataflow, Cloud Bigtable, BigQuery

Answer: D

NEW QUESTION 22
You are in charge of provisioning access for all Google Cloud users in your organization. Your company recently acquired a startup company that has their own
Google Cloud organization. You need to ensure that your Site Reliability Engineers (SREs) have the same project permissions in the startup company's
organization as in your own organization. What should you do?

A. In the Google Cloud console for your organization, select Create role from selection, and choose destination as the startup company's organization.
B. In the Google Cloud console for the startup company, select Create role from selection and choose source as the startup company's Google Cloud organization.
C. Use the gcloud iam roles copy command, and provide the Organization ID of the startup company's Google Cloud Organization as the destination.
D. Use the gcloud iam roles copy command, and provide the project IDs of all projects in the startup company's organization as the destination.

Answer: C


Explanation:
Custom roles can be copied between organizations with the gcloud iam roles copy command, specifying the startup company's Organization ID as the destination. Copying once at the organization level makes the role available across that organization, which avoids repeating the operation for every individual project.
https://cloud.google.com/sdk/gcloud/reference/iam/roles/copy
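The role copy referenced in option C can be sketched as follows; the role ID and both organization IDs are hypothetical:

```shell
# Copy a custom SRE role from your organization to the startup's organization.
gcloud iam roles copy \
  --source="sreProjectRole" \
  --destination="sreProjectRole" \
  --source-organization="111111111111" \
  --dest-organization="222222222222"
```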

NEW QUESTION 24
You create a new Google Kubernetes Engine (GKE) cluster and want to make sure that it always runs a supported and stable version of Kubernetes. What should
you do?

A. Enable the Node Auto-Repair feature for your GKE cluster.


B. Enable the Node Auto-Upgrades feature for your GKE cluster.
C. Select the latest available cluster version for your GKE cluster.
D. Select “Container-Optimized OS (cos)” as a node image for your GKE cluster.

Answer: B

Explanation:
Creating or upgrading a cluster by specifying the version as latest does not provide automatic upgrades. Enable node auto-upgrades to ensure that the nodes in
your cluster are up-to-date with the latest stable version.
https://cloud.google.com/kubernetes-engine/versioning-and-upgrades
Node auto-upgrades help you keep the nodes in your cluster up to date with the cluster master version when your master is updated on your behalf. When you
create a new cluster or node pool with Google Cloud Console or the gcloud command, node auto-upgrade is enabled by default.
Ref: https://cloud.google.com/kubernetes-engine/docs/how-to/node-auto-upgrades
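Auto-upgrade is set per node pool. A sketch with hypothetical pool and zone names:

```shell
# Enable auto-upgrade on an existing node pool of the "dev" cluster.
gcloud container node-pools update default-pool \
  --cluster=dev --zone=us-central1-a --enable-autoupgrade

# Or create a cluster with auto-upgrade enabled from the start.
gcloud container clusters create dev \
  --zone=us-central1-a --enable-autoupgrade
```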

NEW QUESTION 27
Your managed instance group raised an alert stating that new instance creation has failed to create new instances. You need to maintain the number of running
instances specified by the template to be able to process expected application traffic. What should you do?

A. Create an instance template that contains valid syntax which will be used by the instance group. Delete any persistent disks with the same name as instance names.
B. Create an instance template that contains valid syntax that will be used by the instance group. Verify that the instance name and persistent disk name values are not the same in the template.
C. Verify that the instance template being used by the instance group contains valid syntax. Delete any persistent disks with the same name as instance names. Set the disks.autoDelete property to true in the instance template.
D. Delete the current instance template and replace it with a new instance template. Verify that the instance name and persistent disk name values are not the same in the template. Set the disks.autoDelete property to true in the instance template.

Answer: A

Explanation:
https://cloud.google.com/compute/docs/troubleshooting/troubleshooting-migs https://cloud.google.com/compute/docs/instance-
templates#how_to_update_instance_templates

NEW QUESTION 32
You are building an application that processes data files uploaded from thousands of suppliers. Your primary goals for the application are data security and the
expiration of aged data. You need to design the application to:
• Restrict access so that suppliers can access only their own data.
• Give suppliers write access to data only for 30 minutes.
• Delete data that is over 45 days old.
You have a very short development cycle, and you need to make sure that the application requires minimal maintenance. Which two strategies should you use?
(Choose two.)

A. Build a lifecycle policy to delete Cloud Storage objects after 45 days.


B. Use signed URLs to allow suppliers limited time access to store their objects.
C. Set up an SFTP server for your application, and create a separate user for each supplier.
D. Build a Cloud function that triggers a timer of 45 days to delete objects that have expired.
E. Develop a script that loops through all Cloud Storage buckets and deletes any buckets that are older than 45 days.

Answer: AB

Explanation:
(A) Object Lifecycle Management Delete
The Delete action deletes an object when the object meets all conditions specified in the lifecycle rule.
Exception: In buckets with Object Versioning enabled, deleting the live version of an object causes it to become a noncurrent version, while deleting a noncurrent
version deletes that version permanently.
https://cloud.google.com/storage/docs/lifecycle#delete
(B) Signed URLs
This page provides an overview of signed URLs, which you use to give time-limited resource access to anyone in possession of the URL, regardless of whether
they have a Google account
https://cloud.google.com/storage/docs/access-control/signed-urls
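The 45-day deletion rule (option A) is declared once as a lifecycle configuration and then maintenance-free. A sketch; the bucket name in the comment is hypothetical:

```shell
# Lifecycle rule that deletes objects older than 45 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 45}
    }
  ]
}
EOF

# Validate the JSON locally; applying it to a bucket would be:
#   gsutil lifecycle set lifecycle.json gs://supplier-uploads
python3 -m json.tool lifecycle.json
```

Combined with 30-minute signed URLs for uploads (option B), no custom timer code or SFTP server needs to be built or maintained.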

NEW QUESTION 37
You have an application that runs on Compute Engine VM instances in a custom Virtual Private Cloud (VPC). Your company's security policies only allow the use of internal IP addresses on VM instances and do not let VM instances connect to the internet. You need to ensure that the application can access a file hosted in a Cloud Storage bucket within your project. What should you do?


A. Enable Private Service Access on the Cloud Storage bucket.
B. Add storage.googleapis.com to the list of restricted services in a VPC Service Controls perimeter and add your project to the list of protected projects.
C. Enable Private Google Access on the subnet within the custom VPC.
D. Deploy a Cloud NAT instance and route the traffic to the dedicated IP address of the Cloud Storage bucket.

Answer: C

Explanation:
Private Google Access is enabled on a subnet and lets VMs that have only internal IP addresses reach Google APIs and services such as Cloud Storage. Private Service Access is not a setting that can be enabled on a Cloud Storage bucket.
https://cloud.google.com/vpc/docs/private-google-access
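Private Google Access (option C) is configured on the subnet, not on the bucket. A sketch, assuming a subnet named apps-subnet in us-central1:

```shell
# Allow internal-only VMs in this subnet to reach Google APIs such as Cloud Storage.
gcloud compute networks subnets update apps-subnet \
  --region=us-central1 \
  --enable-private-ip-google-access
```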

NEW QUESTION 42
You are assigned to maintain a Google Kubernetes Engine (GKE) cluster named dev that was deployed on Google Cloud. You want to manage the GKE
configuration using the command line interface (CLI). You have just downloaded and installed the Cloud SDK. You want to ensure that future CLI commands by
default address this specific cluster. What should you do?

A. Use the command gcloud config set container/cluster dev.


B. Use the command gcloud container clusters update dev.
C. Create a file called gke.default in the ~/.gcloud folder that contains the cluster name.
D. Create a file called defaults.json in the ~/.gcloud folder that contains the cluster name.

Answer: A

Explanation:
To set a default cluster for gcloud commands, run the following command: gcloud config set container/cluster CLUSTER_NAME
https://cloud.google.com/kubernetes-engine/docs/how-to/managing-clusters?hl=en

NEW QUESTION 43
You have a Google Cloud Platform account with access to both production and development projects. You need to create an automated process to list all compute
instances in development and production projects on a daily basis. What should you do?

A. Create two configurations using gcloud config. Write a script that sets configurations as active, individually. For each configuration, use gcloud compute instances list to get a list of compute resources.
B. Create two configurations using gsutil config. Write a script that sets configurations as active, individually. For each configuration, use gsutil compute instances list to get a list of compute resources.
C. Go to Cloud Shell and export this information to Cloud Storage on a daily basis.
D. Go to GCP Console and export this information to Cloud SQL on a daily basis.

Answer: A

Explanation:
You can create two configurations, one for the development project and another for the production project, by running the "gcloud config configurations create" command.
https://cloud.google.com/sdk/gcloud/reference/config/configurations/create
In your custom script, you can activate these configurations one at a time and execute gcloud compute instances list to list the Compute Engine instances in the project that is active in the gcloud configuration.
Ref: https://cloud.google.com/sdk/gcloud/reference/compute/instances/list
Once you have this information, you can export it in a suitable format to a suitable target, e.g., as CSV or to Cloud Storage/BigQuery/SQL.
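The scripted approach above can be sketched as follows; the configuration names "dev" and "prod" are assumptions for your own setup:

```shell
#!/usr/bin/env bash
# List instances in each project by switching the active gcloud configuration.
for cfg in dev prod; do
  gcloud config configurations activate "$cfg"
  gcloud compute instances list --format="csv(name,zone,status)" \
    > "instances-${cfg}-$(date +%F).csv"
done
```

Running this from cron or Cloud Scheduler gives the required daily listing.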

NEW QUESTION 45
You want to run a single caching HTTP reverse proxy on GCP for a latency-sensitive website. This specific reverse proxy consumes almost no CPU. You want to
have a 30-GB in-memory cache, and need an additional 2 GB of memory for the rest of the processes. You want to minimize cost. How should you run this reverse
proxy?

A. Create a Cloud Memorystore for Redis instance with 32-GB capacity.


B. Run it on Compute Engine, and choose a custom instance type with 6 vCPUs and 32 GB of memory.
C. Package it in a container image, and run it on Kubernetes Engine, using n1-standard-32 instances as nodes.
D. Run it on Compute Engine, choose the instance type n1-standard-1, and add an SSD persistent disk of 32 GB.

Answer: A

Explanation:
Cloud Memorystore for Redis is a fully managed Redis service for Google Cloud Platform. Applications running on Google Cloud Platform can achieve extreme performance by leveraging the highly scalable, highly available, and secure Redis service without the burden of managing complex Redis deployments.

NEW QUESTION 46
You are building an archival solution for your data warehouse and have selected Cloud Storage to archive your data. Your users need to be able to access this
archived data once a quarter for some regulatory requirements. You want to select a cost-efficient option. Which storage option should you use?

A. Coldline Storage
B. Nearline Storage
C. Regional Storage
D. Multi-Regional Storage

Answer: A

Explanation:
Coldline Storage is a very-low-cost, highly durable storage service for storing infrequently accessed data. Coldline Storage is ideal for data you plan to read or
modify at most once a quarter. Since we have a requirement to access data once a quarter and want to go with the most cost-efficient option, we should select
Coldline Storage.
Ref: https://cloud.google.com/storage/docs/storage-classes#coldline


NEW QUESTION 50
You manage an App Engine Service that aggregates and visualizes data from BigQuery. The application is deployed with the default App Engine Service account.
The data that needs to be visualized resides in a different project managed by another team. You do not have access to this project, but you want your application
to be able to read data from the BigQuery dataset. What should you do?

A. Ask the other team to grant your default App Engine Service account the role of BigQuery Job User.
B. Ask the other team to grant your default App Engine Service account the role of BigQuery Data Viewer.
C. In Cloud IAM of your project, ensure that the default App Engine service account has the role of BigQuery Data Viewer.
D. In Cloud IAM of your project, grant a newly created service account from the other team the role of BigQuery Job User in your project.

Answer: B

Explanation:
The resource that you need to get access is in the other project. roles/bigquery.dataViewer BigQuery Data Viewer
When applied to a table or view, this role provides permissions to: Read data and metadata from the table or view.
This role cannot be applied to individual models or routines. When applied to a dataset, this role provides permissions to:
Read the dataset's metadata and list tables in the dataset. Read data and metadata from the dataset's tables.
When applied at the project or organization level, this role can also enumerate all datasets in the project. Additional roles, however, are necessary to allow the
running of jobs.

NEW QUESTION 55
For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already
installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

A. 1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances.2. Update your instances’ metadata to add
the following value: logs-destination:bq://platform-logs.
B. 1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink.2.Create a Cloud Function that is triggered by messages in the
logs topic.3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.
C. 1. In Stackdriver Logging, create a filter to view only Compute Engine logs.2. Click Create Export.3.Choose BigQuery as Sink Service, and the platform-logs
dataset as Sink Destination.
D. 1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset.2. Configure this Cloud Function to create a BigQuery Job that
executes this query:INSERT INTOdataset.platform-logs (timestamp, log)SELECT timestamp, log FROM compute.logsWHERE timestamp>
DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)3. Use Cloud Scheduler to trigger this Cloud Function once a day.

Answer: C

Explanation:
* 1. In Stackdriver Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs
dataset as Sink Destination.
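The same export can be created from the CLI. A sketch with a hypothetical project ID; note the dataset ID uses an underscore because BigQuery dataset IDs cannot contain hyphens:

```shell
# Create the destination dataset.
bq mk --dataset my-project:platform_logs

# Create a sink that routes only Compute Engine logs to BigQuery.
gcloud logging sinks create compute-logs-sink \
  bigquery.googleapis.com/projects/my-project/datasets/platform_logs \
  --log-filter='resource.type="gce_instance"'
```

The sink's writer service account (printed by the create command) still needs the BigQuery Data Editor role on the dataset before logs start flowing.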


NEW QUESTION 56
You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing
GCP project. What should you do?

A. * 1. Verify that you are assigned the Project Owners IAM role for this project.* 2. Locate the project in the GCP console, click Shut down and then enter the
project ID.
B. * 1. Verify that you are assigned the Project Owners IAM role for this project.* 2. Switch to the project in the GCP console, locate the resources and delete them.
C. * 1. Verify that you are assigned the Organizational Administrator IAM role for this project.* 2. Locate the project in the GCP console, enter the project ID and
then click Shut down.
D. * 1. Verify that you are assigned the Organizational Administrators IAM role for this project.* 2. Switch to the project in the GCP console, locate the resources
and delete them.

Answer: A

Explanation:
https://cloud.google.com/run/docs/tutorials/gcloud https://cloud.google.com/resource-manager/docs/creating-managing-projects
https://cloud.google.com/iam/docs/understanding-roles#primitive_roles
You can shut down projects using the Cloud Console. When you shut down a project, this immediately happens: All billing and traffic serving stops, You lose
access to the project, The owners of the project will be notified and can stop the deletion within 30 days, The project will be scheduled to be deleted after 30 days.
However, some resources may be deleted much earlier.

NEW QUESTION 60
You are developing a financial trading application that will be used globally. Data is stored and queried using a relational structure, and clients from all over the
world should get the exact identical state of the data. The application will be deployed in multiple regions to provide the lowest latency to end users. You need to
select a storage option for the application data while minimizing latency. What should you do?

A. Use Cloud Bigtable for data storage.


B. Use Cloud SQL for data storage.
C. Use Cloud Spanner for data storage.
D. Use Firestore for data storage.

Answer: C

Explanation:
Keywords: financial data used globally, stored and queried using a relational structure (SQL), clients should get an exactly identical state of the data (strong consistency), deployed in multiple regions, low latency to end users. Cloud Spanner is the only listed option that combines relational semantics, strong external consistency, and multi-region deployments.

NEW QUESTION 64
You need to grant access for three users so that they can view and edit table data on a Cloud Spanner instance. What should you do?

A. Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to the role.
B. Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to a new group. Add the group to the role.
C. Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to the role.
D. Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to a new group. Add the group to the role.

Answer: B

Explanation:
https://cloud.google.com/spanner/docs/iam#spanner.databaseUser
Using the gcloud tool, run gcloud iam roles describe roles/spanner.databaseUser in Cloud Shell to review the role. Then add the users to a newly created Google group and grant the group the role.
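A sketch of the grant itself, assuming a hypothetical instance name and group address:

```shell
# Grant the group (not individual users) the databaseUser role on the instance
gcloud spanner instances add-iam-policy-binding my-spanner-instance \
  --member="group:spanner-users@example.com" \
  --role="roles/spanner.databaseUser"
```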

NEW QUESTION 69
You have successfully created a development environment in a project for an application. This application uses Compute Engine and Cloud SQL. Now, you need
to create a production environment for this application.
The security team has forbidden the existence of network routes between these 2 environments, and asks you to follow Google-recommended practices. What
should you do?

A. Create a new project, enable the Compute Engine and Cloud SQL APIs in that project, and replicate the setup you have created in the development
environment.
B. Create a new production subnet in the existing VPC and a new production Cloud SQL instance in your existing project, and deploy your application using those
resources.
C. Create a new project, modify your existing VPC to be a Shared VPC, share that VPC with your new project, and replicate the setup you have in the
development environment in that new project, in the Shared VPC.
D. Ask the security team to grant you the Project Editor role in an existing production project used by another division of your company. Once they grant you that role, replicate the setup you have in the development environment in that project.

Answer: A

Explanation:
This aligns with Google's recommended practices. Creating a new project achieves complete isolation between the development and production environments, and also isolates this production application from the production applications of other departments.
Ref: https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#define-hierarchy
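The steps in option A might look like the following sketch (project and organization IDs are hypothetical):

```shell
# Create an isolated production project and enable the required APIs
gcloud projects create my-app-prod --organization=123456789012
gcloud config set project my-app-prod
gcloud services enable compute.googleapis.com sqladmin.googleapis.com
```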

Passing Certification Exams Made Easy visit - https://www.surepassexam.com


Recommend!! Get the Full Associate-Cloud-Engineer dumps in VCE and PDF From SurePassExam
https://www.surepassexam.com/Associate-Cloud-Engineer-exam-dumps.html (244 New Questions)

NEW QUESTION 74
You want to find out when users were added to Cloud Spanner Identity Access Management (IAM) roles on your Google Cloud Platform (GCP) project. What
should you do in the GCP Console?

A. Open the Cloud Spanner console to review configurations.


B. Open the IAM & admin console to review IAM policies for Cloud Spanner roles.
C. Go to the Stackdriver Monitoring console and review information for Cloud Spanner.
D. Go to the Stackdriver Logging console, review admin activity logs, and filter them for Cloud Spanner IAM roles.

Answer: D

Explanation:
https://cloud.google.com/monitoring/audit-logging
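A sketch of the corresponding admin activity log query (the project ID is hypothetical, and the exact method name may vary by service):

```shell
# Admin Activity audit logs record IAM policy changes such as SetIamPolicy
gcloud logging read \
  'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.methodName:"SetIamPolicy" AND protoPayload.serviceName="spanner.googleapis.com"' \
  --project=my-project --limit=20
```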

NEW QUESTION 79
You are building a multi-player gaming application that will store game information in a database. As the popularity of the application increases, you are concerned
about delivering consistent performance. You need to ensure an optimal gaming performance for global users, without increasing the management complexity.
What should you do?

A. Use Cloud SQL database with cross-region replication to store game statistics in the EU, US, and APAC regions.
B. Use Cloud Spanner to store user data mapped to the game statistics.
C. Use BigQuery to store game statistics with a Redis on Memorystore instance in the front to provide global consistency.
D. Store game statistics in a Bigtable database partitioned by username.

Answer: B

NEW QUESTION 81
You are working in a team that has developed a new application that needs to be deployed on Kubernetes. The production application is business critical and
should be optimized for reliability. You need to provision a Kubernetes cluster and want to follow Google-recommended practices. What should you do?

A. Create a GKE Autopilot cluster. Enroll the cluster in the rapid release channel.
B. Create a GKE Autopilot cluster. Enroll the cluster in the stable release channel.
C. Create a zonal GKE standard cluster. Enroll the cluster in the stable release channel.
D. Create a regional GKE standard cluster. Enroll the cluster in the rapid release channel.

Answer: B

Explanation:
Autopilot clusters are optimized for reliability, and the stable release channel gives new GKE versions the most time to be validated and have issues fixed before they reach your cluster.
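A minimal sketch of the recommended cluster (name and region are hypothetical):

```shell
gcloud container clusters create-auto prod-cluster \
  --region=us-central1 \
  --release-channel=stable
```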

NEW QUESTION 86
You have one GCP account running in your default region and zone and another account running in a
non-default region and zone. You want to start a new Compute Engine instance in these two Google Cloud Platform accounts using the command line interface.
What should you do?

A. Create two configurations using gcloud config configurations create [NAME]. Run gcloud config configurations activate [NAME] to switch between accounts
when running the commands to start the Compute Engine instances.
B. Create two configurations using gcloud config configurations create [NAME]. Run gcloud configurations list to start the Compute Engine instances.
C. Activate two configurations using gcloud configurations activate [NAME]. Run gcloud config list to start the Compute Engine instances.
D. Activate two configurations using gcloud configurations activate [NAME]. Run gcloud configurations list to start the Compute Engine instances.

Answer: A

Explanation:
"Run gcloud configurations list to start the Compute Engine instances". How the heck are you expecting to "start" GCE instances doing "configuration list".
Each gcloud configuration has a 1 to 1 relationship with the region (if a region is defined). Since we have two different regions, we would need to create two
separate configurations using gcloud config configurations createRef: https://cloud.google.com/sdk/gcloud/reference/config/configurations/create
Secondly, you can activate each configuration independently by running gcloud config configurations activate [NAME]Ref:
https://cloud.google.com/sdk/gcloud/reference/config/configurations/activate
Finally, while each configuration is active, you can run the gcloud compute instances start [NAME] command to start the instance in the configurations
region.https://cloud.google.com/sdk/gcloud/reference/compute/instances/start
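Putting the steps together as a sketch (configuration names, accounts, zones, and instance names are hypothetical):

```shell
# One configuration per account/region
gcloud config configurations create account-one
gcloud config set account alice@example.com
gcloud config set compute/zone us-central1-a

gcloud config configurations create account-two
gcloud config set account bob@example.org
gcloud config set compute/zone europe-west3-b

# Switch between configurations when starting instances
gcloud config configurations activate account-one
gcloud compute instances start instance-1

gcloud config configurations activate account-two
gcloud compute instances start instance-2
```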

NEW QUESTION 91
You created an instance of SQL Server 2017 on Compute Engine to test features in the new version. You want to connect to this instance using the fewest number
of steps. What should you do?

A. Install an RDP client on your desktop. Verify that a firewall rule for port 3389 exists.
B. Install an RDP client on your desktop. Set a Windows username and password in the GCP Console. Use the credentials to log in to the instance.
C. Set a Windows password in the GCP Console. Verify that a firewall rule for port 22 exists. Click the RDP button in the GCP Console and supply the credentials to log in.
D. Set a Windows username and password in the GCP Console. Verify that a firewall rule for port 3389 exists. Click the RDP button in the GCP Console, and supply the credentials to log in.

Answer: D

Explanation:
https://cloud.google.com/compute/docs/instances/connecting-to-windows#remote-desktop-connection-app
https://cloud.google.com/compute/docs/instances/windows/generating-credentials https://cloud.google.com/compute/docs/instances/connecting-to-windows#before-you-begin
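The credential step from the answer can also be done from the CLI; a sketch with hypothetical names:

```shell
# Generates (or resets) a Windows username and password for the instance
gcloud compute reset-windows-password sql-server-vm \
  --zone=us-central1-a \
  --user=admin_user
```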

NEW QUESTION 94
You need to deploy an application in Google Cloud using serverless technology. You want to test a new version of the application with a small percentage of production traffic. What should you do?

A. Deploy the application to Cloud Run. Use gradual rollouts for traffic splitting.
B. Deploy the application to Google Kubernetes Engine. Use Anthos Service Mesh for traffic splitting.
C. Deploy the application to Cloud Functions. Specify the version number in the function's name.
D. Deploy the application to App Engine. For each new version, create a new service.

Answer: A
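A sketch of the gradual-rollout flow on Cloud Run (service, image, and region names are hypothetical):

```shell
# Deploy the new revision without sending it any traffic
gcloud run deploy my-app \
  --image=gcr.io/my-project/my-app:v2 \
  --region=us-central1 \
  --no-traffic

# Shift 5% of traffic to the latest revision
gcloud run services update-traffic my-app \
  --region=us-central1 \
  --to-revisions=LATEST=5
```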

NEW QUESTION 99
You need to host an application on a Compute Engine instance in a project shared with other teams. You want to prevent the other teams from accidentally
causing downtime on that application. Which feature should you use?

A. Use a Shielded VM.


B. Use a Preemptible VM.
C. Use a sole-tenant node.
D. Enable deletion protection on the instance.

Answer: D

Explanation:
As part of your workload, there might be certain VM instances that are critical to running your application or services, such as an instance running a SQL server or a server used as a license manager. These VM instances might need to stay running indefinitely, so you need a way to protect them from deletion. By setting the deletionProtection flag, a VM instance can be protected from accidental deletion. If a user attempts to delete a VM instance for which you have set the deletionProtection flag, the request fails. Only a user who has been granted a role with the compute.instances.create permission can reset the flag to allow the resource to be deleted.
Ref: https://cloud.google.com/compute/docs/instances/preventing-accidental-vm-deletion
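The flag can be set at creation time or on an existing VM; a sketch with hypothetical names:

```shell
# Enable deletion protection when creating the instance
gcloud compute instances create app-vm --zone=us-central1-a --deletion-protection

# ...or enable it on an instance that already exists
gcloud compute instances update app-vm --zone=us-central1-a --deletion-protection
```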

NEW QUESTION 103


Users of your application are complaining of slowness when loading the application. You realize the slowness is because the App Engine deployment serving the
application is deployed in us-central whereas all users of this application are closest to europe-west3. You want to change the region of the App Engine application
to europe-west3 to minimize latency. What’s the best way to change the App Engine region?

A. Create a new project and create an App Engine instance in europe-west3


B. Use the gcloud app region set command and supply the name of the new region.
C. From the console, under the App Engine page, click edit, and change the region drop-down.
D. Contact Google Cloud Support and request the change.

Answer: A

Explanation:
App Engine is a regional service: the infrastructure that runs your app is located in a specific region and is managed by Google to be redundantly available across all the zones within that region. Once an App Engine deployment is created in a region, it can't be changed. The only way is to create a new project, create an App Engine application in europe-west3, send all user traffic to that application, and delete the App Engine application in us-central.
Ref: https://cloud.google.com/appengine/docs/locations
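A sketch of the new-project approach (the project ID is hypothetical); the region is chosen once, at app-creation time:

```shell
gcloud projects create my-app-eu
gcloud app create --project=my-app-eu --region=europe-west3
```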

NEW QUESTION 107


You have an application that receives SSL-encrypted TCP traffic on port 443. Clients for this application are located all over the world. You want to minimize
latency for the clients. Which load balancing option should you use?

A. HTTPS Load Balancer


B. Network Load Balancer
C. SSL Proxy Load Balancer
D. Internal TCP/UDP Load Balancer. Add a firewall rule allowing ingress traffic from 0.0.0.0/0 on the target instances.

Answer: C

NEW QUESTION 110


Your company publishes large files on an Apache web server that runs on a Compute Engine instance. The Apache web server is not the only application running
in the project. You want to receive an email when the egress network costs for the server exceed 100 dollars for the current month as measured by Google Cloud
Platform (GCP). What should you do?


A. Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
B. Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
C. Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
D. Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

Answer: C

Explanation:
https://blog.doit-intl.com/the-truth-behind-google-cloud-egress-traffic-6e8f57b5c2f8
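A hedged sketch of the kind of query the Cloud Function could run against the billing export (the dataset and table names are hypothetical; the standard export schema includes service, sku, cost, and invoice.month fields):

```shell
bq query --use_legacy_sql=false '
SELECT SUM(cost) AS egress_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
WHERE service.description = "Compute Engine"
  AND sku.description LIKE "%Egress%"
  AND invoice.month = FORMAT_DATE("%Y%m", CURRENT_DATE())'
```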

NEW QUESTION 115


You are developing a new web application that will be deployed on Google Cloud Platform. As part of your release cycle, you want to test updates to your
application on a small portion of real user traffic. The majority of the users should still be directed towards a stable version of your application. What should you
do?

A. Deploy the application on App Engine. For each update, create a new version of the same service. Configure traffic splitting to send a small percentage of traffic to the new version.
B. Deploy the application on App Engine. For each update, create a new service. Configure traffic splitting to send a small percentage of traffic to the new service.
C. Deploy the application on Kubernetes Engine. For a new release, update the deployment to use the new version.
D. Deploy the application on Kubernetes Engine. For a new release, create a new deployment for the new version. Update the service to use the new deployment.

Answer: D

Explanation:
Keywords: version, traffic splitting. App Engine supports traffic splitting across versions before a full release.

NEW QUESTION 120


You are working with a Cloud SQL MySQL database at your company. You need to retain a month-end copy of the database for three years for audit purposes.
What should you do?

A. Save the automatic first-of-the-month backup for three years. Store the backup file in an Archive class Cloud Storage bucket.
B. Convert the automatic first-of-the-month backup to an export file. Write the export file to a Coldline class Cloud Storage bucket.
C. Set up an export job for the first of the month. Write the export file to an Archive class Cloud Storage bucket.
D. Set up an on-demand backup for the first of the month. Write the backup to an Archive class Cloud Storage bucket.

Answer: C

Explanation:
https://cloud.google.com/sql/docs/mysql/backup-recovery/backups#can_i_export_a_backup https://cloud.google.com/sql/docs/mysql/import-export#automating_export_operations
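A sketch of the export job's core commands (bucket, instance, and database names are hypothetical; the Cloud SQL instance's service account needs write access to the bucket):

```shell
# One-time: create an Archive-class bucket for the audit copies
gsutil mb -c archive -l us-central1 gs://my-sql-audit-exports

# Monthly: export the database to the bucket
gcloud sql export sql my-instance \
  gs://my-sql-audit-exports/backup-$(date +%Y-%m).sql.gz \
  --database=my-database
```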

NEW QUESTION 123


You want to verify the IAM users and roles assigned within a GCP project named my-project. What should you do?

A. Run gcloud iam roles list. Review the output section.
B. Run gcloud iam service-accounts list. Review the output section.
C. Navigate to the project and then to the IAM section in the GCP Console. Review the members and roles.
D. Navigate to the project and then to the Roles section in the GCP Console. Review the roles and status.

Answer: C

Explanation:
Navigating to the project's IAM section in the GCP Console shows all of the assigned members and their roles.
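The same review can be done from the CLI; a sketch with a hypothetical project ID:

```shell
gcloud projects get-iam-policy my-project \
  --flatten="bindings[].members" \
  --format="table(bindings.role, bindings.members)"
```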

NEW QUESTION 128


You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your
Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs. What should you do?

A. Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.
B. Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.
C. Assign the auditor’s IAM user to a custom role that has logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.
D. Assign the auditor’s IAM user to a custom role that has logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.

Answer: B


Explanation:
Google Cloud provides Cloud Audit Logs, an integral part of Cloud Logging. It consists of two log streams for each project: Admin Activity and Data Access, which are generated by Google Cloud services to help you answer the question of "who did what, where, and when?" within your Google Cloud projects.
Ref: https://cloud.google.com/iam/docs/job-functions/auditing#scenario_external_auditors
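A sketch of the role grant (the project ID and auditor address are hypothetical):

```shell
gcloud projects add-iam-policy-binding my-project \
  --member="user:auditor@example.com" \
  --role="roles/logging.privateLogViewer"
```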

NEW QUESTION 133


After a recent security incident, your startup company wants better insight into what is happening in the Google Cloud environment. You need to monitor
unexpected firewall changes and instance creation. Your company prefers simple solutions. What should you do?

A. Use Cloud Logging filters to create log-based metrics for firewall and instance actions. Monitor the changes and set up reasonable alerts.
B. Install Kibana on a compute instance. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Pub/Sub. Target the Pub/Sub topic to push messages to the Kibana instance. Analyze the logs on Kibana in real time.
C. Turn on Google Cloud firewall rules logging, and set up alerts for any insert, update, or delete events.
D. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Cloud Storage. Use BigQuery to periodically analyze log events in the storage bucket.

Answer: A

Explanation:
This answer is the simplest and most effective way to monitor unexpected firewall changes and instance creation in Google Cloud. Cloud Logging filters allow you
to specify the criteria for the log entries that you want to view or export. You can use the Logging query language to write filters based on the LogEntry fields, such
as resource.type, severity, or protoPayload.methodName. For example, you can filter for firewall-related events by using the following query:
resource.type=“gce_subnetwork” logName=“projects/PROJECT_ID/logs/compute.googleapis.com%2Ffirewall”
You can filter for instance-related events by using the following query: resource.type=“gce_instance”
logName=“projects/PROJECT_ID/logs/compute.googleapis.com%2Factivity_log”
You can create log-based metrics from these filters to measure the rate or count of log entries that match the filter. Log-based metrics can be used to create charts
and dashboards in Cloud Monitoring, or to set up alerts based on the metric values. For example, you can create an alert policy that triggers when the log-based
metric for firewall changes exceeds a certain threshold in a given time interval. This way, you can get notified of any unexpected or malicious changes to your
firewall rules.
Option B is incorrect because it is unnecessarily complex and costly. Installing Kibana on a compute instance requires additional configuration and maintenance.
Creating a log sink to forward Cloud Audit Logs to Pub/Sub also incurs additional charges for the Pub/Sub service. Analyzing the logs on Kibana in real time may
not be feasible or efficient, as it requires constant monitoring and manual intervention.
Option C is incorrect because Google Cloud firewall rules logging is a different feature from Cloud Audit Logs. Firewall rules logging allows you to audit, verify, and
analyze the effects of your firewall rules by creating connection records for each rule that applies to traffic. However, firewall rules logging does not log the insert,
update, or delete events for the firewall rules themselves. Those events are logged by Cloud Audit Logs, which record the administrative activities in your Google
Cloud project.
Option D is incorrect because it is not a real-time solution. Creating a log sink to forward Cloud Audit Logs to Cloud Storage requires additional storage space and
charges. Using BigQuery to periodically analyze log events in the storage bucket also incurs additional costs for the BigQuery service. Moreover, this option does
not provide any alerting mechanism to notify you of any unexpected or malicious changes to your firewall rules or instances.
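A hedged sketch of such a log-based metric for firewall changes (the metric name is hypothetical, and the exact method names in your audit logs may differ):

```shell
gcloud logging metrics create firewall-changes \
  --description="Firewall rule inserts, updates, and deletes" \
  --log-filter='resource.type="gce_firewall_rule" AND (protoPayload.methodName:"firewalls.insert" OR protoPayload.methodName:"firewalls.patch" OR protoPayload.methodName:"firewalls.delete")'
```

An alerting policy in Cloud Monitoring can then trigger on this metric.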

NEW QUESTION 137


You are running a data warehouse on BigQuery. A partner company is offering a recommendation engine based on the data in your data warehouse. The partner
company is also running their application on Google Cloud. They manage the resources in their own project, but they need access to the BigQuery dataset in your
project. You want to provide the partner company with access to the dataset. What should you do?

A. Create a Service Account in your own project, and grant this Service Account access to BigQuery in your project
B. Create a Service Account in your own project, and ask the partner to grant this Service Account access to BigQuery in their project
C. Ask the partner to create a Service Account in their project, and have them give the Service Account access to BigQuery in their project
D. Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project

Answer: D

Explanation:
https://gtseres.medium.com/using-service-accounts-across-projects-in-gcp-cf9473fef8f0#:~:text=Go%20to%20t
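One way to express the grant at dataset level is by editing the dataset's access entries with the bq tool; a sketch with hypothetical dataset and service-account names:

```shell
# Dump the dataset metadata, add the partner's service account as a READER,
# then write the metadata back
bq show --format=prettyjson my_dataset > dataset.json
# Edit dataset.json and append to the "access" array:
#   {"role": "READER", "userByEmail": "engine@partner-project.iam.gserviceaccount.com"}
bq update --source dataset.json my_dataset
```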

NEW QUESTION 141


You recently deployed a new version of an application to App Engine and then discovered a bug in the release. You need to immediately revert to the prior version
of the application. What should you do?

A. Run gcloud app restore.
B. On the App Engine page of the GCP Console, select the application that needs to be reverted and click Revert.
C. On the App Engine Versions page of the GCP Console, route 100% of the traffic to the previous version.
D. Deploy the original version as a separate application. Then go to App Engine settings and split traffic between applications so that the original version serves 100% of the requests.

Answer: C
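The console action in the answer has a CLI equivalent; a sketch (the service and version IDs are hypothetical):

```shell
# Find the previous version's ID
gcloud app versions list --service=default

# Route 100% of traffic back to it (substitute the prior version's ID)
gcloud app services set-traffic default --splits=PREVIOUS_VERSION_ID=1
```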

NEW QUESTION 142


You have created a new project in Google Cloud through the gcloud command line interface (CLI) and linked a billing account. You need to create a new Compute
Engine instance using the CLI. You need to perform the prerequisite steps. What should you do?

A. Create a Cloud Monitoring Workspace.


B. Create a VPC network in the project.
C. Enable the compute.googleapis.com API.
D. Grant yourself the IAM role of Compute Admin.


Answer: D
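A sketch of both candidate prerequisite steps together (the project ID and user are hypothetical):

```shell
# Grant yourself Compute Admin on the new project
gcloud projects add-iam-policy-binding my-new-project \
  --member="user:me@example.com" \
  --role="roles/compute.admin"

# Enable the Compute Engine API, then create the instance
gcloud services enable compute.googleapis.com --project=my-new-project
gcloud compute instances create vm-1 --zone=us-central1-a --project=my-new-project
```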

NEW QUESTION 145


You are assisting a new Google Cloud user who just installed the Google Cloud SDK on their VM. The server needs access to Cloud Storage. The user wants your
help to create a new storage bucket. You need to make this change in multiple environments. What should you do?

A. Use a Deployment Manager script to automate creating storage buckets in an appropriate region
B. Use a local SSD to improve performance of the VM for the targeted workload
C. Use the gsutil command to create a storage bucket in the same region as the VM
D. Use a Persistent Disk SSD in the same zone as the VM to improve performance of the VM

Answer: A
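A minimal Deployment Manager sketch for a bucket that can be reused across environments (the deployment name, bucket name, and region are hypothetical):

```shell
cat > config.yaml <<'EOF'
resources:
- name: my-app-bucket-dev
  type: storage.v1.bucket
  properties:
    location: us-central1
EOF

gcloud deployment-manager deployments create bucket-deployment --config=config.yaml
```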

NEW QUESTION 150


Your management has asked an external auditor to review all the resources in a specific project. The security team has enabled the Organization Policy called
Domain Restricted Sharing on the organization node by specifying only your Cloud Identity domain. You want the auditor to only be able to view, but not modify,
the resources in that project. What should you do?

A. Ask the auditor for their Google account, and give them the Viewer role on the project.
B. Ask the auditor for their Google account, and give them the Security Reviewer role on the project.
C. Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.
D. Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.

Answer: C

Explanation:
Avoid using primitive roles except when absolutely necessary. These roles are very powerful and include a large number of permissions across all Google Cloud services. For more details on when you should use primitive roles, see the Identity and Access Management FAQ. IAM predefined roles are much more granular and allow you to carefully manage the set of permissions that your users have access to. See Understanding Roles for a list of roles that can be granted at the project level. Creating custom roles can further increase the control you have over user permissions. https://cloud.google.com/resource-manager/docs/access-control-proj#using_primitive_roles
https://cloud.google.com/iam/docs/understanding-custom-roles
https://cloud.google.com/iam/docs/understanding-custom-roles

NEW QUESTION 152


You need to select and configure compute resources for a set of batch processing jobs. These jobs take around 2 hours to complete and are run nightly. You want
to minimize service costs. What should you do?

A. Select Google Kubernetes Engine. Use a single-node cluster with a small instance type.
B. Select Google Kubernetes Engine. Use a three-node cluster with micro instance types.
C. Select Compute Engine. Use preemptible VM instances of the appropriate standard machine type.
D. Select Compute Engine. Use VM instance types that support micro bursting.

Answer: C

Explanation:
If your apps are fault-tolerant and can withstand possible instance preemptions, then preemptible instances can reduce your Compute Engine costs significantly.
For example, batch processing jobs can run on preemptible instances. If some of those instances stop during processing, the job slows but does not completely
stop. Preemptible instances complete your batch processing tasks without placing additional workload on your existing instances and without requiring you to pay
full price for additional normal instances.
https://cloud.google.com/compute/docs/instances/preemptible
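A sketch of creating such a worker (the name, zone, and machine type are hypothetical):

```shell
gcloud compute instances create batch-worker \
  --zone=us-central1-a \
  --machine-type=n1-standard-4 \
  --preemptible
```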

NEW QUESTION 156


You are migrating a business critical application from your local data center into Google Cloud. As part of your high-availability strategy, you want to ensure that
any data used by the application will be immediately available if a zonal failure occurs. What should you do?

A. Store the application data on a zonal persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.
B. Store the application data on a zonal persistent disk. If an outage occurs, create an instance in another zone with this disk attached.
C. Store the application data on a regional persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.
D. Store the application data on a regional persistent disk. If an outage occurs, create an instance in another zone with this disk attached.

Answer: A
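A sketch of the snapshot schedule from the selected answer (the policy name, disk name, and location are hypothetical):

```shell
# Create a daily snapshot schedule with 7-day retention
gcloud compute resource-policies create snapshot-schedule daily-schedule \
  --region=us-central1 \
  --daily-schedule \
  --start-time=04:00 \
  --max-retention-days=7

# Attach it to the application's data disk
gcloud compute disks add-resource-policies app-data-disk \
  --zone=us-central1-a \
  --resource-policies=daily-schedule
```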

NEW QUESTION 158


Your company set up a complex organizational structure on Google Cloud Platform. The structure includes hundreds of folders and projects. Only a few team members should be able to view the hierarchical structure. You need to assign minimum permissions to these team members, and you want to follow Google-recommended practices. What should you do?

A. Add the users to roles/browser role.


B. Add the users to roles/iam.roleViewer role.


C. Add the users to a group, and add this group to roles/browser role.
D. Add the users to a group, and add this group to roles/iam.roleViewer role.

Answer: C

Explanation:
We need to apply GCP best practices: grant roles to groups rather than to individual users. The roles/browser (Browser) role grants read access to browse the hierarchy for a project, including the folder, organization, and IAM policy. This role doesn't include permission to view resources in the project. https://cloud.google.com/iam/docs/understanding-roles
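A sketch of the group-based grant at the organization level (the organization ID and group address are hypothetical):

```shell
gcloud organizations add-iam-policy-binding 123456789012 \
  --member="group:hierarchy-viewers@example.com" \
  --role="roles/browser"
```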

NEW QUESTION 162


......


Thank You for Trying Our Product

We offer two products:

1st - We have Practice Tests Software with Actual Exam Questions

2nd - Questions and Answers in PDF Format

Associate-Cloud-Engineer Practice Exam Features:

* Associate-Cloud-Engineer Questions and Answers Updated Frequently

* Associate-Cloud-Engineer Practice Questions Verified by Expert Senior Certified Staff

* Associate-Cloud-Engineer Most Realistic Questions that Guarantee you a Pass on Your First Try

* Associate-Cloud-Engineer Practice Test Questions in Multiple Choice Formats and Updates for 1 Year

100% Actual & Verified — Instant Download, Please Click


Order The Associate-Cloud-Engineer Practice Test Here



