Uploaded by Nirupam Kothari

Q167) Your company developed a mobile game that is deployed on Google Cloud.

Gamers are connecting to the game with their personal phones over the Internet. The
game sends UDP packets to update the servers about the gamers' actions while they
are playing in multiplayer mode. Your game backend can scale over multiple virtual
machines (VMs), and you want to expose the VMs over a single IP address. What
should you do?

​ A. Configure an SSL Proxy load balancer in front of the application servers.


​ B. Configure an Internal UDP load balancer in front of the application servers.
​ C. Configure an External HTTP(S) load balancer in front of the application servers.
​ D. Configure an External Network load balancer in front of the application
servers. Most Voted

Q168.) You are working for a hospital that stores its medical images in an on-premises
data room. The hospital wants to use Cloud Storage for archival storage of these
images. The hospital wants an automated process to upload any new medical images
to Cloud Storage. You need to design and implement a solution. What should you do?

​ A. Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub
topic. Create an application that sends all medical images to the Pub/Sub topic.
​ B. Deploy a Dataflow job from the batch template, "Datastore to Cloud Storage." Schedule the batch job on the desired interval.
​ C. Create a script that uses the gsutil command line interface to
synchronize the on-premises storage with Cloud Storage. Schedule the
script as a cron job. Most Voted
​ D. In the Cloud Console, go to Cloud Storage. Upload the relevant images to the
appropriate bucket.
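The most-voted answer (C) can be sketched as a short script plus a cron entry; the local directory, bucket name, script path, and schedule below are illustrative placeholders, not part of the question.

```shell
#!/bin/bash
# sync_images.sh -- one-way sync of new on-premises images to Cloud Storage.
# gsutil rsync only uploads objects that are new or changed, so rerunning
# it on a schedule is cheap.
gsutil -m rsync -r /data/medical-images gs://example-hospital-image-archive

# Example crontab entry (run nightly at 02:00):
# 0 2 * * * /opt/scripts/sync_images.sh >> /var/log/image-sync.log 2>&1
```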
Q169. Your auditor wants to view your organization's use of data in Google Cloud. The
auditor is most interested in auditing who accessed data in Cloud Storage buckets. You
need to help the auditor access the data they need. What should you do?

​ A. Turn on Data Access Logs for the buckets they want to audit, and then
build a query in the log viewer that filters on Cloud Storage. Most Voted
​ B. Assign the appropriate permissions, and then create a Data Studio report on
Admin Activity Audit Logs.
​ C. Assign the appropriate permissions, and then use Cloud Monitoring to review
metrics.
​ D. Use the export logs API to provide the Admin Activity Audit Logs in the format
they want.

Q170. You received a JSON file that contained a private key of a Service Account in
order to get access to several resources in a Google Cloud project. You downloaded
and installed the Cloud SDK and want to use this private key for authentication and
authorization when performing gcloud commands. What should you do?

A. Use the command gcloud auth login and point it to the private key.

B. Use the command gcloud auth activate-service-account and point it to the private key. Most Voted

C. Place the private key file in the installation directory of the Cloud SDK and rename it to "credentials.json".

D. Place the private key file in your home directory and rename it to "GOOGLE_APPLICATION_CREDENTIALS".
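The most-voted answer (B) corresponds to a single command; the key file path is a placeholder.

```shell
# Authenticate gcloud as the service account using the downloaded JSON key.
gcloud auth activate-service-account --key-file=/path/to/key.json

# Confirm which account is now active:
gcloud auth list
```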
Q171.) You are working with a Cloud SQL MySQL database at your company. You need
to retain a month-end copy of the database for three years for audit purposes.
What should you do?

​ A. Set up an export job for the first of the month. Write the export file to an
Archive class Cloud Storage bucket. Most Voted

​ B. Save the automatic first-of-the-month backup for three years. Store the
backup file in an Archive class Cloud Storage bucket.

​ C. Set up an on-demand backup for the first of the month. Write the backup to an
Archive class Cloud Storage bucket.

​ D. Convert the automatic first-of-the-month backup to an export file. Write the export file to a Coldline class Cloud Storage bucket.
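The most-voted answer (A) amounts to an export job that writes to an Archive-class bucket; instance, database, bucket, and region names are illustrative, and the export command itself would be run from a scheduled job.

```shell
# One-time setup: a bucket with the Archive storage class.
gsutil mb -c archive -l us-central1 gs://example-sql-audit-archive

# Month-end export (schedule for the first of each month);
# a .gz suffix makes Cloud SQL compress the export.
gcloud sql export sql prod-mysql \
    gs://example-sql-audit-archive/export-$(date +%Y-%m).sql.gz \
    --database=orders
```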

Q172. You are monitoring an application and receive user feedback that a specific error
is spiking. You notice that the error is caused by a Service Account having insufficient
permissions. You are able to solve the problem but want to be notified if the problem
recurs. What should you do?

​ A. In the Log Viewer, filter the logs on severity 'Error' and the name of the Service
Account.
​ B. Create a sink to BigQuery to export all the logs. Create a Data Studio
dashboard on the exported logs.
​ C. Create a custom log-based metric for the specific error to be used in an
Alerting Policy. Most Voted
​ D. Grant Project Owner access to the Service Account.
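The most-voted answer (C) starts with a log-based metric, which an alerting policy in Cloud Monitoring can then reference. The metric name and filter below are assumptions; the filter must match however the error actually appears in your logs.

```shell
# Counter metric that increments each time the specific error is logged.
gcloud logging metrics create sa-permission-errors \
    --description="Service account permission-denied errors" \
    --log-filter='severity>=ERROR AND textPayload:"PERMISSION_DENIED"'
```

An alerting policy on this metric can then notify you whenever the count rises above zero.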

Q173.) You are working with a client who plans to migrate their data to Google Cloud.
You are responsible for recommending an encryption service to manage their encrypted
keys. You have the following requirements:
✑ The master key must be rotated at least once every 45 days.
✑ The solution that stores the master key must be FIPS 140-2 Level 3 validated.
✑ The master key must be stored in multiple regions within the US for redundancy.
Which solution meets these requirements?

​ A. Customer-managed encryption keys with Cloud Key Management Service


​ B. Customer-managed encryption keys with Cloud HSM Most Voted
​ C. Customer-supplied encryption keys
​ D. Google-managed encryption keys

Q174. You are about to deploy a new Enterprise Resource Planning (ERP) system on
Google Cloud. The application holds the full database in-memory for fast data access,
and you need to configure the most appropriate resources on Google Cloud for this
application. What should you do?

​ A. Provision preemptible Compute Engine instances.


​ B. Provision Compute Engine instances with GPUs attached.
​ C. Provision Compute Engine instances with local SSDs attached.
​ D. Provision Compute Engine instances with M1 machine type. Most Voted

Q175.) You have developed an application that consists of multiple microservices, with
each microservice packaged in its own Docker container image. You want to deploy the
entire application on Google Kubernetes Engine so that each microservice can be
scaled individually. What should you do?

​ A. Create and deploy a Custom Resource Definition per microservice.


​ B. Create and deploy a Docker Compose File.
​ C. Create and deploy a Job per microservice.
​ D. Create and deploy a Deployment per microservice. Most Voted

Q176.) You will have several applications running on different Compute Engine
instances in the same project. You want to specify at a more granular level the service
account each instance uses when calling Google Cloud APIs. What should you do?
​ A. When creating the instances, specify a Service Account for each
instance. Most Voted
​ B. When creating the instances, assign the name of each Service Account as
instance metadata.
​ C. After starting the instances, use gcloud compute instances update to specify a
Service Account for each instance.
​ D. After starting the instances, use gcloud compute instances update to assign the name of the relevant Service Account as instance metadata.
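The most-voted answer (A) is done with a flag at instance creation time; the instance, zone, and service account email are illustrative.

```shell
# Each instance gets its own service account when it is created.
gcloud compute instances create app-1 \
    --zone=us-central1-a \
    --service-account=app-1-sa@example-project.iam.gserviceaccount.com \
    --scopes=cloud-platform
```

With `--scopes=cloud-platform`, effective access is governed by the service account's IAM roles rather than by legacy access scopes.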

Q177.) You are creating an application that will run on Google Kubernetes Engine. You
have identified MongoDB as the most suitable database system for your application and
want to deploy a managed MongoDB environment that provides a support SLA. What
should you do?

​ A. Create a Cloud Bigtable cluster, and use the HBase API.


​ B. Deploy MongoDB Atlas from the Google Cloud Marketplace. Most Voted
​ C. Download a MongoDB installation package, and run it on Compute Engine
instances.
​ D. Download a MongoDB installation package, and run it on a Managed Instance
Group.

Q178. You are managing a project for the Business Intelligence (BI) department in your company. A data pipeline ingests data into BigQuery via streaming. You want the users in the BI department to be able to run custom SQL queries against the latest data in BigQuery. What should you do?

​ A. Create a Data Studio dashboard that uses the related BigQuery tables as a
source and give the BI team view access to the Data Studio dashboard.
​ B. Create a Service Account for the BI team and distribute a new private key to
each member of the BI team.
​ C. Use Cloud Scheduler to schedule a batch Dataflow job to copy the data from
BigQuery to the BI team's internal data warehouse.
​ D. Assign the IAM role of BigQuery User to a Google Group that contains
the members of the BI team. Most Voted

Q179. Your company is moving its entire workload to Compute Engine. Some servers
should be accessible through the Internet, and other servers should only be accessible
over the internal network. All servers need to be able to talk to each other over specific
ports and protocols. The current on-premises network relies on a demilitarized zone
(DMZ) for the public servers and a Local Area Network (LAN) for the private servers.
You need to design the networking infrastructure on
Google Cloud to match these requirements. What should you do?

​ A. 1. Create a single VPC with a subnet for the DMZ and a subnet for the
LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ
and the LAN subnets, and another firewall rule to allow public ingress
traffic for the DMZ. Most Voted
​ B. 1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2.
Set up firewall rules to open up relevant traffic between the DMZ and the LAN
subnets, and another firewall rule to allow public egress traffic for the DMZ.
​ C. 1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for
the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and
the LAN subnets, and another firewall rule to allow public ingress traffic for the
DMZ.
​ D. 1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for
the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and
the LAN subnets, and another firewall rule to allow public egress traffic for the
DMZ.

Q180.) You have just created a new project which will be used to deploy a globally
distributed application. You will use Cloud Spanner for data storage. You want to create
a Cloud Spanner instance. You want to perform the first step in preparation of creating
the instance. What should you do?

​ A. Enable the Cloud Spanner API. Most Voted


​ B. Configure your Cloud Spanner instance to be multi-regional.
​ C. Create a new VPC network with subnetworks in all desired regions.
​ D. Grant yourself the IAM role of Cloud Spanner Admin.
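The most-voted answer (A) is the first prerequisite; afterwards the instance itself can be created. The instance name is a placeholder, and `nam3` is one of Spanner's US multi-region configurations.

```shell
# Step 1: enable the API in the new project.
gcloud services enable spanner.googleapis.com

# Later: create the instance, e.g. with a US multi-region config.
gcloud spanner instances create game-instance \
    --config=nam3 --description="Game data" --nodes=1
```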

Q181. You have created a new project in Google Cloud through the gcloud
command line interface (CLI) and linked a billing account. You need to create a
new Compute
Engine instance using the CLI. You need to perform the prerequisite steps. What
should you do?

​ A. Create a Cloud Monitoring Workspace.


​ B. Create a VPC network in the project.
​ C. Enable the compute.googleapis.com API. Most Voted
​ D. Grant yourself the IAM role of Compute Admin.
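The most-voted answer (C) is again an API-enablement step; VM name and zone are illustrative.

```shell
gcloud services enable compute.googleapis.com

# Verify the API is enabled, then create the instance:
gcloud services list --enabled --filter="config.name=compute.googleapis.com"
gcloud compute instances create vm-1 --zone=us-central1-a
```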

Q182. Your company has developed a new application that consists of multiple microservices. You want to deploy the application to Google Kubernetes Engine (GKE),
and you want to ensure that the cluster can scale as more applications are deployed in
the future. You want to avoid manual intervention when each new application is
deployed. What should you do?

​ A. Deploy the application on GKE, and add a HorizontalPodAutoscaler to the deployment.
​ B. Deploy the application on GKE, and add a VerticalPodAutoscaler to the
deployment.
​ C. Create a GKE cluster with autoscaling enabled on the node pool. Set a
minimum and maximum for the size of the node pool. Most Voted
​ D. Create a separate node pool for each application, and deploy each application
to its dedicated node pool.
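The most-voted answer (C) maps to cluster-creation flags; the cluster name, zone, and size bounds are illustrative.

```shell
# Node-pool autoscaling with explicit minimum and maximum sizes.
gcloud container clusters create app-cluster \
    --zone=us-central1-a \
    --num-nodes=3 \
    --enable-autoscaling --min-nodes=1 --max-nodes=10
```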

Q183.) You need to manage a third-party application that will run on a Compute Engine
instance. Other Compute Engine instances are already running with default
configuration. Application installation files are hosted on Cloud Storage. You need to
access these files from the new instance without allowing other virtual machines (VMs)
to access these files. What should you do?

​ A. Create the instance with the default Compute Engine service account. Grant
the service account permissions on Cloud Storage.
​ B. Create the instance with the default Compute Engine service account. Add
metadata to the objects on Cloud Storage that matches the metadata on the new
instance.
​ C. Create a new service account and assign this service account to the new
instance. Grant the service account permissions on Cloud Storage. Most
Voted

​ D. Create a new service account and assign this service account to the new
instance. Add metadata to the objects on Cloud Storage that matches the
metadata on the new instance.
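The most-voted answer (C) can be sketched in three commands; all names are placeholders.

```shell
# 1. Dedicated service account for the new instance.
gcloud iam service-accounts create installer-sa

# 2. Grant it read access to the installer bucket only.
gsutil iam ch \
  serviceAccount:installer-sa@example-project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://example-installer-files

# 3. Attach the account to the new instance at creation time.
gcloud compute instances create third-party-vm \
    --zone=us-central1-a \
    --service-account=installer-sa@example-project.iam.gserviceaccount.com \
    --scopes=cloud-platform
```

The other VMs keep the default service account, which never received this grant, so they cannot read the bucket.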

Q184.) You need to configure optimal data storage for files stored in Cloud Storage for
minimal cost. The files are used in a mission-critical analytics pipeline that is used
continually. The users are in Boston, MA (United States). What should you do?

​ A. Configure regional storage for the region closest to the users. Configure a
Nearline storage class.
​ B. Configure regional storage for the region closest to the users. Configure
a Standard storage class. Most Voted
​ C. Configure dual-regional storage for the dual region closest to the users.
Configure a Nearline storage class.
​ D. Configure dual-regional storage for the dual region closest to the users.
Configure a Standard storage class.

Q185.) You are developing a new web application that will be deployed on Google
Cloud Platform. As part of your release cycle, you want to test updates to your
application on a small portion of real user traffic. The majority of the users should still be
directed towards a stable version of your application. What should you do?
​ A. Deploy the application on App Engine. For each update, create a new
version of the same service. Configure traffic splitting to send a small
percentage of traffic to the new version. Most Voted
​ B. Deploy the application on App Engine. For each update, create a new service.
Configure traffic splitting to send a small percentage of traffic to the new service.
​ C. Deploy the application on Kubernetes Engine. For a new release, update the
deployment to use the new version.
​ D. Deploy the application on Kubernetes Engine. For a new release, create a
new deployment for the new version. Update the service to use the new
deployment.
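The most-voted answer (A) corresponds to App Engine's versioned deploys plus traffic splitting; the version IDs and the 95/5 split are illustrative.

```shell
# Deploy the update as a new version without sending any traffic to it.
gcloud app deploy --version=v2 --no-promote

# Route 5% of traffic on the default service to the new version.
gcloud app services set-traffic default \
    --splits=v1=0.95,v2=0.05 --split-by=random
```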

Q186.) You need to add a group of new users to Cloud Identity. Some of the users
already have existing Google accounts. You want to follow one of Google's
recommended practices and avoid conflicting accounts. What should you do?

​ A. Invite the user to transfer their existing account.


​ B. Invite the user to use an email alias to resolve the conflict.
​ C. Tell the user that they must delete their existing account.
​ D. Tell the user to remove all personal email from the existing account.

Q187.) You need to manage a Cloud Spanner instance for best query performance.
Your instance in production runs in a single Google Cloud region. You need to improve
performance in the shortest amount of time. You want to follow Google best practices
for service configuration. What should you do?

A. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU
utilization reaches 45%. If you exceed this threshold, add nodes to your instance.

B. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU
utilization reaches 45%. Use database query statistics to identify queries that result in
high CPU usage, and then rewrite those queries to optimize their resource usage.
C. Create an alert in Cloud Monitoring to alert when the percentage of high
priority CPU utilization reaches 65%. If you exceed this threshold, add nodes to
your instance. Most Voted

D. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU
utilization reaches 65%. Use database query statistics to identify queries that result in
high CPU usage, and then rewrite those queries to optimize their resource usage.

Q188. Your company has an internal application for managing transactional orders. The application is used exclusively by employees in a single physical
location. The application requires strong consistency, fast queries, and ACID
guarantees for multi-table transactional updates. The first version of the
application is implemented in PostgreSQL, and you want to deploy it to the cloud
with minimal code changes. Which database is most appropriate for this
application?

​ A. BigQuery
​ B. Cloud SQL Most Voted
​ C. Cloud Spanner
​ D. Cloud Datastore

Q189.) You are assigned to maintain a Google Kubernetes Engine (GKE) cluster named
'dev' that was deployed on Google Cloud. You want to manage the GKE configuration
using the command line interface (CLI). You have just downloaded and installed the
Cloud SDK. You want to ensure that future CLI commands by default address this specific cluster. What should you do?

​ A. Use the command gcloud config set container/cluster dev. Most Voted
​ B. Use the command gcloud container clusters update dev.
​ C. Create a file called gke.default in the ~/.gcloud folder that contains the cluster
name.
​ D. Create a file called defaults.json in the ~/.gcloud folder that contains the
cluster name.
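The most-voted answer (A) sets a gcloud property so that later commands default to this cluster; the zone value is an assumption.

```shell
gcloud config set container/cluster dev
gcloud config set compute/zone us-central1-a

# Commands that accept a default cluster now resolve to 'dev', e.g.:
gcloud container clusters get-credentials dev
```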

Q190.) The sales team has a project named Sales Data Digest that has the ID
acme-data-digest. You need to set up similar Google Cloud resources for the marketing
team but their resources must be organized independently of the sales team. What
should you do?

​ A. Grant the Project Editor role to the Marketing team for acme-data-digest.
​ B. Create a Project Lien on acme-data-digest and then grant the Project Editor
role to the Marketing team.
​ C. Create another project with the ID acme-marketing-data-digest for the
Marketing team and deploy the resources there. Most Voted
​ D. Create a new project named Marketing Data Digest and use the ID
acme-data-digest. Grant the Project Editor role to the Marketing team.

191. You have deployed multiple Linux instances on Compute Engine. You plan on
adding more instances in the coming weeks. You want to be able to access all of these
instances through your SSH client over the internet without having to configure specific
access on the existing and new instances. You do not want the
Compute Engine instances to have a public IP. What should you do?

​ A. Configure Cloud Identity-Aware Proxy for HTTPS resources.


​ B. Configure Cloud Identity-Aware Proxy for SSH and TCP resources. Most Voted

​ C. Create an SSH keypair and store the public key as a project-wide SSH Key.
​ D. Create an SSH keypair and store the private key as a project-wide SSH Key.

193. You are using Data Studio to visualize a table from your data warehouse that is
built on top of BigQuery. Data is appended to the data warehouse during the day.
At night, the daily summary is recalculated by overwriting the table. You just noticed that
the charts in Data Studio are broken, and you want to analyze the problem. What
should you do?

​ A. Review the Error Reporting page in the Cloud Console to find any errors.
​ B. Use the BigQuery interface to review the nightly job and look for any
errors. Most Voted
​ C. Use Cloud Debugger to find out why the data was not refreshed correctly.
​ D. In Cloud Logging, create a filter for your Data Studio report.

194. You have been asked to set up the billing configuration for a new Google Cloud
customer. Your customer wants to group resources that share common IAM policies.
What should you do?

​ A. Use labels to group resources that share common IAM policies.


​ B. Use folders to group resources that share common IAM policies. Most Voted

​ C. Set up a proper billing account structure to group IAM policies.


​ D. Set up a proper project naming structure to group IAM policies.
195. You have been asked to create robust Virtual Private Network (VPN) connectivity
between a new Virtual Private Cloud (VPC) and a remote site. Key requirements include
dynamic routing, a shared address space of 10.19.0.1/22, and no overprovisioning of
tunnels during a failover event. You want to follow Google-recommended practices to
set up a high availability Cloud VPN. What should you do?

​ A. Use a custom mode VPC network, configure static routes, and use
active/passive routing.

​ B. Use an automatic mode VPC network, configure static routes, and use
active/active routing.

​ C. Use a custom mode VPC network, use Cloud Router border gateway
protocol (BGP) routes, and use active/passive routing. Most Voted

​ D. Use an automatic mode VPC network, use Cloud Router border gateway
protocol (BGP) routes, and configure policy-based routing.

196. You are running multiple microservices in a Kubernetes Engine cluster. One
microservice is rendering images. The microservice responsible for the image rendering
requires a large amount of CPU time compared to the memory it requires. The other
microservices are workloads that are optimized for n1-standard machine types. You
need to optimize your cluster so that all workloads are using resources as efficiently as
possible. What should you do?

​ A. Assign the pods of the image rendering microservice a higher pod priority than
the other microservices.

​ B. Create a node pool with compute-optimized machine type nodes for the
image rendering microservice. Use the node pool with general-purpose
machine type nodes for the other microservices. Most Voted

​ C. Use the node pool with general-purpose machine type nodes for the image
rendering microservice. Create a node pool with compute-optimized machine
type nodes for the other microservices.
​ D. Configure the required amount of CPU and memory in the resource requests
specification of the image rendering microservice deployment. Keep the resource
requests for the other microservices at the default.
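The most-voted answer (B) adds a dedicated node pool; the cluster name, zone, machine type, and pool size are illustrative.

```shell
# Compute-optimized pool for the CPU-heavy rendering workload.
gcloud container node-pools create render-pool \
    --cluster=prod-cluster --zone=us-central1-a \
    --machine-type=c2-standard-8 --num-nodes=2
```

The rendering Deployment can then target this pool with a nodeSelector on the `cloud.google.com/gke-nodepool: render-pool` label, while the other microservices stay on the general-purpose pool.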

197. Your organization has three existing Google Cloud projects. You need to bill the
Marketing department for only their Google Cloud services for a new initiative within
their group. What should you do?

​ A. 1. Verify that you are assigned the Billing Administrator IAM role for your
organization's Google Cloud Project for the Marketing department. 2. Link the
new project to a Marketing Billing Account. Most Voted
​ B. 1. Verify that you are assigned the Billing Administrator IAM role for your organization's Google Cloud account. 2. Create a new Google Cloud Project for the Marketing department. 3. Set the default key-value project labels to department:marketing for all services in this project.
​ C. 1. Verify that you are assigned the Organization Administrator IAM role
for your organization's Google Cloud account. 2. Create a new Google
Cloud Project for the Marketing department. 3. Link the new project to a
Marketing Billing Account.
​ D. 1. Verify that you are assigned the Organization Administrator IAM role for
your organization's Google Cloud account. 2. Create a new Google Cloud Project
for the Marketing department. 3. Set the default key-value project labels to
department:marketing for all services in this project.

198. You deployed an application on a managed instance group in Compute Engine. The application accepts Transmission Control Protocol (TCP) traffic on port 389 and
requires you to preserve the IP address of the client who is making a request. You want
to expose the application to the internet by using a load balancer. What should you do?

​ A. Expose the application by using an external TCP Network Load Balancer. Most Voted
​ B. Expose the application by using a TCP Proxy Load Balancer.
​ C. Expose the application by using an SSL Proxy Load Balancer.
​ D. Expose the application by using an internal TCP Network Load Balancer.
199. You are building a multi-player gaming application that will store game information
in a database. As the popularity of the application increases, you are concerned about
delivering consistent performance. You need to ensure an optimal gaming performance
for global users, without increasing the management complexity. What should you do?

​ A. Use Cloud SQL database with cross-region replication to store game statistics
in the EU, US, and APAC regions.
​ B. Use Cloud Spanner to store user data mapped to the game statistics.
Most Voted

​ C. Use BigQuery to store game statistics with a Redis on Memorystore instance in the front to provide global consistency.
​ D. Store game statistics in a Bigtable database partitioned by username.

200. You are building an application that stores relational data from users. Users across
the globe will use this application. Your CTO is concerned about the scaling
requirements because the size of the user base is unknown. You need to implement a
database solution that can scale with your user growth with minimum configuration
changes. Which storage solution should you use?

​ A. Cloud SQL
​ B. Firestore
​ C. Cloud Spanner Most Voted
​ D. Bigtable

201. Your company has multiple projects linked to a single billing account in Google
Cloud. You need to visualize the costs with specific metrics that should be dynamically
calculated based on company-specific criteria. You want to automate the process. What
should you do?

​ A. In the Google Cloud console, visualize the costs related to the projects in the
Reports section.
​ B. In the Google Cloud console, visualize the costs related to the projects in the
Cost breakdown section.
​ C. In the Google Cloud console, use the export functionality of the Cost table.
Create a Looker Studio dashboard on top of the CSV export.
​ D. Configure Cloud Billing data export to BigQuery for the billing account.
Create a Looker Studio dashboard on top of the BigQuery export. Most Voted
202. You have an application that runs on Compute Engine VM instances in a custom
Virtual Private Cloud (VPC). Your company’s security policies only allow the use of
internal IP addresses on VM instances and do not let VM instances connect to the
internet. You need to ensure that the application can access a file hosted in a Cloud
Storage bucket within your project. What should you do?

​ A. Enable Private Service Access on the Cloud Storage Bucket.


​ B. Add storage.googleapis.com to the list of restricted services in a VPC Service
Controls perimeter and add your project to the list of protected projects.
​ C. Enable Private Google Access on the subnet within the custom VPC. Most Voted

​ D. Deploy a Cloud NAT instance and route the traffic to the dedicated IP address
of the Cloud Storage bucket.
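The most-voted answer (C) is a single subnet update; the subnet name and region are placeholders.

```shell
# Let VMs that only have internal IPs reach Google APIs such as Cloud Storage.
gcloud compute networks subnets update app-subnet \
    --region=us-central1 \
    --enable-private-ip-google-access
```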

203. Your company completed the acquisition of a startup and is now merging the IT
systems of both companies. The startup had a production Google Cloud project in their
organization. You need to move this project into your organization and ensure that the
project is billed to your organization. You want to accomplish this task with minimal
effort. What should you do?

​ A. Use the projects.move method to move the project to your organization. Update the billing account of the project to that of your organization. Most Voted

​ B. Ensure that you have an Organization Administrator Identity and Access Management (IAM) role assigned to you in both organizations. Navigate to the Resource Manager in the startup’s Google Cloud organization, and drag the project to your company's organization.
​ C. Create a Private Catalog for the Google Cloud Marketplace, and upload the
resources of the startup's production project to the Catalog. Share the Catalog
with your organization, and deploy the resources in your company’s project.
​ D. Create an infrastructure-as-code template for all resources in the project by
using Terraform, and deploy that template to a new project in your organization.
Delete the project from the startup’s Google Cloud organization.
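The most-voted answer (A) reduces to two commands; the project ID, organization ID, and billing account ID are placeholders, and `projects move` has historically lived under the `beta` release track.

```shell
# Move the project into your organization.
gcloud beta projects move startup-prod --organization=123456789012

# Re-link the project to your organization's billing account.
gcloud billing projects link startup-prod \
    --billing-account=000000-AAAAAA-BBBBBB
```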

204. All development (dev) teams in your organization are located in the United States.
Each dev team has its own Google Cloud project. You want to restrict access so that
each dev team can only create cloud resources in the United States (US). What should
you do?
​ A. Create a folder to contain all the dev projects. Create an organization
policy to limit resources in US locations. Most Voted
​ B. Create an organization to contain all the dev projects. Create an Identity and
Access Management (IAM) policy to limit the resources in US regions.
​ C. Create an Identity and Access Management (IAM) policy to restrict the
resources locations in the US. Apply the policy to all dev projects.
​ D. Create an Identity and Access Management (IAM) policy to restrict the
resources locations in all dev projects. Apply the policy to all dev roles.
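The most-voted answer (A) can be applied with the `gcp.resourceLocations` constraint on the folder that holds the dev projects; the folder ID is a placeholder.

```shell
# Allow only US locations for resources created under the dev folder.
gcloud resource-manager org-policies allow gcp.resourceLocations \
    in:us-locations --folder=1234567890
```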

205. You are configuring Cloud DNS. You want to create DNS records to point
home.mydomain.com, mydomain.com, and www.mydomain.com to the IP address of
your Google Cloud load balancer. What should you do?

​ A. Create one CNAME record to point mydomain.com to the load balancer, and
create two A records to point WWW and HOME to mydomain.com respectively.
​ B. Create one CNAME record to point mydomain.com to the load balancer, and
create two AAAA records to point WWW and HOME to mydomain.com
respectively.
​ C. Create one A record to point mydomain.com to the load balancer, and
create two CNAME records to point WWW and HOME to mydomain.com
respectively. Most Voted
​ D. Create one A record to point mydomain.com to the load balancer, and create
two NS records to point WWW and HOME to mydomain.com respectively.
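The most-voted answer (C) maps to three record-set commands; the managed zone name and the load balancer IP are placeholders.

```shell
# A record for the apex pointing at the load balancer IP.
gcloud dns record-sets create mydomain.com. \
    --zone=mydomain-zone --type=A --ttl=300 --rrdatas=203.0.113.10

# CNAME records for the hostnames pointing back at the apex.
gcloud dns record-sets create www.mydomain.com. \
    --zone=mydomain-zone --type=CNAME --ttl=300 --rrdatas=mydomain.com.
gcloud dns record-sets create home.mydomain.com. \
    --zone=mydomain-zone --type=CNAME --ttl=300 --rrdatas=mydomain.com.
```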

206. You have two subnets (subnet-a and subnet-b) in the default VPC. Your database
servers are running in subnet-a. Your application servers and web servers are running
in subnet-b. You want to configure a firewall rule that only allows database traffic from
the application servers to the database servers. What should you do?

​ A. • Create service accounts sa-app and sa-db.
• Associate service account sa-app with the application servers and the service account sa-db with the database servers.
• Create an ingress firewall rule to allow network traffic from source service account sa-app to target service account sa-db. Most Voted
​ B. • Create network tags app-server and db-server.
• Add the app-server tag to the application servers and the db-server tag to the database servers.
• Create an egress firewall rule to allow network traffic from source network tag app-server to target network tag db-server.
​ C. • Create a service account sa-app and a network tag db-server.
• Associate the service account sa-app with the application servers and the
network tag db-server with the database servers.
• Create an ingress firewall rule to allow network traffic from source VPC IP
addresses and target the subnet-a IP addresses.
​ D. • Create a network tag app-server and service account sa-db.
• Add the tag to the application servers and associate the service account with
the database servers.
• Create an egress firewall rule to allow network traffic from source network tag
app-server to target service account sa-db.
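For reference, the service-account-based ingress rule in the voted answer can be sketched with gcloud. The rule name, project ID, and port (3306 for MySQL) are illustrative assumptions, not part of the question:

```shell
# Sketch only: rule name, project, and port are placeholders.
gcloud compute firewall-rules create allow-app-to-db \
    --network=default \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:3306 \
    --source-service-accounts=sa-app@my-project.iam.gserviceaccount.com \
    --target-service-accounts=sa-db@my-project.iam.gserviceaccount.com
```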

207. Your team wants to deploy a specific content management system (CMS) solution
to Google Cloud. You need a quick and easy way to deploy and install the solution.
What should you do?

​ A. Search for the CMS solution in Google Cloud Marketplace. Use gcloud CLI to
deploy the solution.
​ B. Search for the CMS solution in Google Cloud Marketplace. Deploy the
solution directly from Cloud Marketplace. Most Voted
​ C. Search for the CMS solution in Google Cloud Marketplace. Use Terraform and
the Cloud Marketplace ID to deploy the solution with the appropriate parameters.
​ D. Use the installation guide of the CMS provider. Perform the installation
through your configuration management system.

208. You are working for a startup that was officially registered as a business 6 months
ago. As your customer base grows, your use of Google Cloud increases. You want to
allow all engineers to create new projects without asking them for their credit card
information. What should you do?

​ A. Create a Billing account, associate a payment method with it, and provide all project creators with permission to associate that billing account with their projects. Most Voted
​ B. Grant all engineers permission to create their own billing accounts for each
new project.
​ C. Apply for monthly invoiced billing, and have a single invoice for the project
paid by the finance team.
​ D. Create a billing account, associate it with a monthly purchase order (PO), and
send the PO to Google Cloud.

209. Your continuous integration and delivery (CI/CD) server can’t execute Google
Cloud actions in a specific project because of permission issues. You need to validate
whether the used service account has the appropriate roles in the specific project.

What should you do?

​ A. Open the Google Cloud console, and check the Identity and Access
Management (IAM) roles assigned to the service account at the project or
inherited from the folder or organization levels. Most Voted
​ B. Open the Google Cloud console, and check the organization policies.
​ C. Open the Google Cloud console, and run a query to determine which
resources this service account can access.
​ D. Open the Google Cloud console, and run a query of the audit logs to find
permission denied errors for this service account.

210. Your team is using Linux instances on Google Cloud. You need to ensure that your
team logs in to these instances in the most secure and cost efficient way. What should
you do?

​ A. Attach a public IP to the instances and allow incoming connections from the
internet on port 22 for SSH.
​ B. Use the gcloud compute ssh command with the --tunnel-through-iap flag. Allow ingress traffic from the IP range 35.235.240.0/20 on port 22. Most Voted
​ C. Use a third-party tool to provide remote access to the instances.
​ D. Create a bastion host with public internet access. Create the SSH tunnel to the instance through the bastion host.
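The voted answer's two steps can be sketched as follows; the rule name, instance name, and zone are placeholders:

```shell
# Allow IAP's TCP-forwarding range to reach port 22.
gcloud compute firewall-rules create allow-iap-ssh \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:22 \
    --source-ranges=35.235.240.0/20

# SSH through the IAP tunnel; the VM needs no public IP.
gcloud compute ssh my-instance --zone=us-central1-a --tunnel-through-iap
```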

211. An external member of your team needs list access to compute images and disks
in one of your projects. You want to follow Google-recommended practices when you
grant the required permissions to this user. What should you do?

​ A. Create a custom role, and add all the required compute.disks.list and
compute.images.list permissions as includedPermissions. Grant the
custom role to the user at the project level. Most Voted
​ B. Create a custom role based on the Compute Image User role. Add the
compute.disks.list to the includedPermissions field. Grant the custom role to the
user at the project level.
​ C. Create a custom role based on the Compute Storage Admin role. Exclude
unnecessary permissions from the custom role. Grant the custom role to the user
at the project level.
​ D. Grant the Compute Storage Admin role at the project level.
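A sketch of the voted answer; the role ID, title, project, and user email are placeholders:

```shell
# Create a custom role containing only the two list permissions.
gcloud iam roles create imageAndDiskLister \
    --project=my-project \
    --title="Image and Disk Lister" \
    --permissions=compute.images.list,compute.disks.list

# Grant the custom role to the external user at the project level.
gcloud projects add-iam-policy-binding my-project \
    --member=user:external.user@example.com \
    --role=projects/my-project/roles/imageAndDiskLister
```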

212. You are running a web application on Cloud Run for a few hundred users. Some of
your users complain that the initial web page of the application takes much longer to
load than the following pages. You want to follow Google’s recommendations to mitigate
the issue. What should you do?

​ A. Set the minimum number of instances for your Cloud Run service to 3. Most Voted
​ B. Set the concurrency number to 1 for your Cloud Run service.
​ C. Set the maximum number of instances for your Cloud Run service to 100.
​ D. Update your web application to use the protocol HTTP/2 instead of HTTP/1.1.
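Keeping warm instances avoids cold-start latency on the first request. The voted answer can be applied with a command like this; the service name and region are placeholders:

```shell
# Keep at least 3 instances warm to avoid cold starts.
gcloud run services update my-service \
    --region=us-central1 \
    --min-instances=3
```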

213. You are building a data lake on Google Cloud for your Internet of Things (IoT)
application. The IoT application has millions of sensors that are constantly streaming
structured and unstructured data to your backend in the cloud. You want to build a
highly available and resilient architecture based on Google-recommended practices.
What should you do?

​ A. Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage. Most Voted
​ B. Stream data to Pub/Sub, and use Storage Transfer Service to send data to
BigQuery.
​ C. Stream data to Dataflow, and use Dataprep by Trifacta to send data to
Bigtable.
​ D. Stream data to Dataflow, and use Storage Transfer Service to send data to
BigQuery.

214. You are running out of primary internal IP addresses in a subnet for a custom mode
VPC. The subnet has the IP range 10.0.0.0/20, and the IP addresses are primarily used
by virtual machines in the project. You need to provide more IP addresses for the virtual
machines. What should you do?

​ A. Add a secondary IP range 10.1.0.0/20 to the subnet.
​ B. Change the subnet IP range from 10.0.0.0/20 to 10.0.0.0/18. Most Voted
​ C. Change the subnet IP range from 10.0.0.0/20 to 10.0.0.0/22.
​ D. Convert the subnet IP range from IPv4 to IPv6.
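Subnet ranges can only be expanded, never shrunk, which is why /18 works and /22 does not. A sketch of the voted answer; the subnet name and region are placeholders:

```shell
# Expand the primary range in place from /20 to /18.
gcloud compute networks subnets expand-ip-range my-subnet \
    --region=us-central1 \
    --prefix-length=18
```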

215. Your company requires all developers to have the same permissions, regardless of
the Google Cloud project they are working on. Your company’s security policy also
restricts developer permissions to Compute Engine, Cloud Functions, and Cloud SQL.
You want to implement the security policy with minimal effort. What should you do?

​ A. • Create a custom role with Compute Engine, Cloud Functions, and Cloud
SQL permissions in one project within the Google Cloud organization.
• Copy the role across all projects created within the organization with the gcloud
iam roles copy command.
• Assign the role to developers in those projects.
​ B. • Add all developers to a Google group in Google Groups for Workspace.
• Assign the predefined role of Compute Admin to the Google group at the
Google Cloud organization level.
​ C. • Add all developers to a Google group in Cloud Identity.
• Assign predefined roles for Compute Engine, Cloud Functions, and Cloud SQL
permissions to the Google group for each project in the Google Cloud
organization.
​ D. • Add all developers to a Google group in Cloud Identity.
• Create a custom role with Compute Engine, Cloud Functions, and Cloud
SQL permissions at the Google Cloud organization level.
• Assign the custom role to the Google group. Most Voted
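The voted answer's steps can be sketched as follows. The organization ID, group address, and the (abbreviated) permission list are illustrative assumptions:

```shell
# Create one custom role at the organization level (abbreviated permission list).
gcloud iam roles create developer \
    --organization=123456789012 \
    --title="Developer" \
    --permissions=compute.instances.list,cloudfunctions.functions.list,cloudsql.instances.list

# Bind the role to the developers group once, at the organization level.
gcloud organizations add-iam-policy-binding 123456789012 \
    --member=group:developers@example.com \
    --role=organizations/123456789012/roles/developer
```

Because the role and binding live at the organization level, they apply to every project without per-project maintenance.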

218. Your company runs one batch process in an on-premises server that takes around
30 hours to complete. The task runs monthly, can be performed offline, and must be
restarted if interrupted. You want to migrate this workload to the cloud while minimizing
cost. What should you do?

​ A. Create an Instance Template with Spot VMs enabled. Create a Managed Instance
Group from the template and adjust Target CPU Utilization. Migrate the workload.
​ B. Migrate the workload to a Compute Engine VM. Start and stop the
instance as needed. Most Voted
​ C. Migrate the workload to a Google Kubernetes Engine cluster with Spot nodes.
​ D. Migrate the workload to a Compute Engine Spot VM.

219. You are planning to migrate the following on-premises data management solutions
to Google Cloud:

• One MySQL cluster for your main database


• Apache Kafka for your event streaming platform
• One Cloud SQL for PostgreSQL database for your analytical and reporting needs

You want to implement Google-recommended solutions for the migration. You need to
ensure that the new solutions provide global scalability and require minimal operational
and infrastructure management. What should you do?

​ A. Migrate from MySQL to Cloud SQL, from Kafka to Pub/Sub, and from Cloud
SQL for PostgreSQL to BigQuery.
​ B. Migrate from MySQL to Cloud Spanner, from Kafka to Pub/Sub, and from
Cloud SQL for PostgreSQL to BigQuery. Most Voted
​ C. Migrate from MySQL to Cloud Spanner, from Kafka to Memorystore, and from
Cloud SQL for PostgreSQL to Cloud SQL.
​ D. Migrate from MySQL to Cloud SQL, from Kafka to Memorystore, and from
Cloud SQL for PostgreSQL to Cloud SQL.

220. During a recent audit of your existing Google Cloud resources, you discovered
several users with email addresses outside of your Google Workspace domain. You
want to ensure that your resources are only shared with users whose email addresses
match your domain. You need to remove any mismatched users, and you want to avoid
having to audit your resources to identify mismatched users. What should you do?

​ A. Create a Cloud Scheduler task to regularly scan your projects and delete
mismatched users.
​ B. Create a Cloud Scheduler task to regularly scan your resources and delete
mismatched users.
​ C. Set an organizational policy constraint to limit identities by domain to
automatically remove mismatched users.
​ D. Set an organizational policy constraint to limit identities by domain, and
then retroactively remove the existing mismatched users Most Voted
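The domain-restriction constraint in the voted answer accepts Google Workspace customer IDs, not raw domain names. A sketch, with placeholder customer and organization IDs:

```shell
# Allow only identities from the organization's Workspace customer ID.
# C0xxxxxxx and the organization ID are placeholders.
gcloud resource-manager org-policies allow \
    constraints/iam.allowedPolicyMemberDomains C0xxxxxxx \
    --organization=123456789012
```

Note the constraint only blocks new grants; existing mismatched members must still be removed manually, as the voted answer states.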

221. Your application is running on Google Cloud in a managed instance group (MIG).
You see errors in Cloud Logging for one VM that one of the processes is not responsive.
You want to replace this VM in the MIG quickly. What should you do?

​ A. Use the gcloud compute instances update command with a REFRESH action
for the VM.
​ B. Use the gcloud compute instance-groups managed recreate-instances
command to recreate the VM. Most Voted
​ C. Select the MIG from the Compute Engine console and, in the menu, select
Replace VMs.
​ D. Update and apply the instance template of the MIG.
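A sketch of the voted answer; the MIG name, instance name, and zone are placeholders:

```shell
# Delete and recreate the unhealthy VM using the MIG's instance template.
gcloud compute instance-groups managed recreate-instances my-mig \
    --instances=my-instance \
    --zone=us-central1-a
```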

222. You want to permanently delete a Pub/Sub topic managed by Config Connector in
your Google Cloud project. What should you do?

​ A. Use kubectl to create the label deleted-by-cnrm and to change its value to true
for the topic resource.
​ B. Use kubectl to delete the topic resource. Most Voted
​ C. Use gcloud CLI to delete the topic.
​ D. Use gcloud CLI to update the topic label managed-by-cnrm to false.

223. Your company is using Google Workspace to manage employee accounts.
Anticipated growth will increase the number of personnel from 100 employees to 1,000
employees within 2 years. Most employees will need access to your company’s Google
Cloud account. The systems and processes will need to support 10x growth without
performance degradation, unnecessary complexity, or security issues. What should you
do?

​ A. Migrate the users to Active Directory. Connect the Human Resources system
to Active Directory. Turn on Google Cloud Directory Sync (GCDS) for Cloud
Identity. Turn on Identity Federation from Cloud Identity to Active Directory.
​ B. Organize the users in Cloud Identity into groups. Enforce multi-factor
authentication in Cloud Identity. Most Voted
​ C. Turn on identity federation between Cloud Identity and Google Workspace. Enforce multi-factor authentication for domain wide delegation.
​ D. Use a third-party identity provider service through federation. Synchronize the users from Google Workspace to the third-party provider in real time.

224. You want to host your video encoding software on Compute Engine. Your user
base is growing rapidly, and users need to be able to encode their videos at any time
without interruption or CPU limitations. You must ensure that your encoding solution is
highly available, and you want to follow Google-recommended practices to automate
operations. What should you do?

​ A. Deploy your solution on multiple standalone Compute Engine instances, and increase the number of existing instances when CPU utilization on Cloud Monitoring reaches a certain threshold.
​ B. Deploy your solution on multiple standalone Compute Engine instances, and
replace existing instances with high-CPU instances when CPU utilization on
Cloud Monitoring reaches a certain threshold.
​ C. Deploy your solution to an instance group, and increase the number of
available instances whenever you see high CPU utilization in Cloud Monitoring.
​ D. Deploy your solution to an instance group, and set the autoscaling
based on CPU utilization. Most Voted
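The voted answer's autoscaling policy could be configured like this; the group name, zone, and thresholds are illustrative:

```shell
# Autoscale the MIG on CPU utilization (values are placeholders).
gcloud compute instance-groups managed set-autoscaling encoder-mig \
    --zone=us-central1-a \
    --min-num-replicas=2 \
    --max-num-replicas=20 \
    --target-cpu-utilization=0.6
```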

225. Your managed instance group raised an alert stating that new instance creation has
failed to create new instances. You need to solve the instance creation problem. What
should you do?

​ A. Create an instance template that contains valid syntax which will be used by the instance group. Delete any persistent disks with the same name as instance names. Most Voted
​ B. Create an instance template that contains valid syntax that will be used by the
instance group. Verify that the instance name and persistent disk name values
are not the same in the template.
​ C. Verify that the instance template being used by the instance group contains
valid syntax. Delete any persistent disks with the same name as instance names.
Set the disks.autoDelete property to true in the instance template.
​ D. Delete the current instance template and replace it with a new instance
template. Verify that the instance name and persistent disk name values are not
the same in the template. Set the disks.autoDelete property to true in the
instance template.

228. You have a batch workload that runs every night and uses a large number of virtual
machines (VMs). It is fault-tolerant and can tolerate some of the VMs being terminated.
The current cost of VMs is too high. What should you do?

​ A. Run a test using simulated maintenance events. If the test is successful, use Spot N2 Standard VMs when running future jobs. Most Voted
​ B. Run a test using simulated maintenance events. If the test is successful, use
N2 Standard VMs when running future jobs.
​ C. Run a test using a managed instance group. If the test is successful, use N2
Standard VMs in the managed instance group when running future jobs.
​ D. Run a test using N1 standard VMs instead of N2. If the test is successful, use
N1 Standard VMs when running future jobs.

229. You created several resources in multiple Google Cloud projects. All projects are
linked to different billing accounts. To better estimate future charges, you want to have a
single visual representation of all costs incurred. You want to include new cost data as
soon as possible. What should you do?

​ A. Fill all resources in the Pricing Calculator to get an estimate of the monthly
cost.
​ B. Use the Reports view in the Cloud Billing Console to view the desired cost
information.
​ C. Visit the Cost Table page to get a CSV export and visualize it using Looker
Studio.
​ D. Configure Billing Data Export to BigQuery and visualize the data in
Looker Studio. Most Voted

232. You recently received a new Google Cloud project with an attached billing account
where you will work. You need to create instances, set firewalls, and store data in Cloud
Storage. You want to follow Google-recommended practices. What should you do?

​ A. Use the gcloud CLI services enable cloudresourcemanager.googleapis.com command to enable all resources.
​ B. Use the gcloud services enable compute.googleapis.com command to
enable Compute Engine and the gcloud services enable
storage-api.googleapis.com command to enable the Cloud Storage APIs.
Most Voted
​ C. Open the Google Cloud console and enable all Google Cloud APIs from the
API dashboard.
​ D. Open the Google Cloud console and run gcloud init --project in a Cloud Shell.

233. Your application development team has created Docker images for an application
that will be deployed on Google Cloud. Your team does not want to manage the
infrastructure associated with this application. You need to ensure that the application
can scale automatically as it gains popularity. What should you do?

​ A. Create an instance template with the container image, and deploy a Managed
Instance Group with Autoscaling.
​ B. Upload Docker images to Artifact Registry, and deploy the application on
Google Kubernetes Engine using Standard mode.
​ C. Upload Docker images to the Cloud Storage, and deploy the application on
Google Kubernetes Engine using Standard mode.
​ D. Upload Docker images to Artifact Registry, and deploy the application on
Cloud Run. Most Voted

234. You are migrating a business critical application from your local data center into
Google Cloud. As part of your high-availability strategy, you want to ensure that any
data used by the application will be immediately available if a zonal failure occurs. What
should you do?

​ A. Store the application data on a zonal persistent disk. Create a snapshot schedule for the disk. If an outage occurs, create a new disk from the most recent snapshot and attach it to a new VM in another zone.
​ B. Store the application data on a zonal persistent disk. If an outage occurs,
create an instance in another zone with this disk attached.
​ C. Store the application data on a regional persistent disk. Create a snapshot
schedule for the disk. If an outage occurs, create a new disk from the most
recent snapshot and attach it to a new VM in another zone.
​ D. Store the application data on a regional persistent disk. If an outage
occurs, create an instance in another zone with this disk attached. Most Voted
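After a zonal failure, a regional persistent disk can be force-attached to a VM in the surviving zone. A sketch, with placeholder names and locations:

```shell
# Sketch: names and zone are placeholders. --force-attach lets the
# replacement VM take over the regional disk even though the original
# VM (in the failed zone) cannot release it cleanly.
gcloud compute instances attach-disk failover-vm \
    --zone=us-central1-b \
    --disk=my-regional-disk \
    --disk-scope=regional \
    --force-attach
```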

235. The DevOps group in your organization needs full control of Compute Engine
resources in your development project. However, they should not have permission to
create or update any other resources in the project. You want to follow Google’s
recommendations for setting permissions for the DevOps group. What should you do?
​ A. Grant the basic role roles/viewer and the predefined role
roles/compute.admin to the DevOps group. Most Voted
​ B. Create an IAM policy and grant all compute.instanceAdmin.* permissions to
the policy. Attach the policy to the DevOps group.
​ C. Create a custom role at the folder level and grant all compute.instanceAdmin.*
permissions to the role. Grant the custom role to the DevOps group.
​ D. Grant the basic role roles/editor to the DevOps group.

Q266. You are planning to migrate your on-premises data to Google Cloud. The data
includes:

• 200 TB of video files in SAN storage
• Data warehouse data stored on Amazon Redshift
• 20 GB of PNG files stored on an S3 bucket

You need to load the video files into a Cloud Storage bucket, transfer the data
warehouse data into BigQuery, and load the PNG files into a second Cloud Storage
bucket. You want to follow Google-recommended practices and avoid writing any code
for the migration. What should you do?

​ A. Use gcloud storage for the video files, Dataflow for the data warehouse data,
and Storage Transfer Service for the PNG files.
​ B. Use Transfer Appliance for the videos, BigQuery Data Transfer Service
for the data warehouse data, and Storage Transfer Service for the PNG
files. Most Voted
​ C. Use Storage Transfer Service for the video files, BigQuery Data Transfer
Service for the data warehouse data, and Storage Transfer Service for the PNG
files.
​ D. Use Cloud Data Fusion for the video files, Dataflow for the data warehouse
data, and Storage Transfer Service for the PNG files.

Q265. You just installed the Google Cloud CLI on your new corporate laptop. You need
to list the existing instances of your company on Google Cloud. What must you do
before you run the gcloud compute instances list command? (Choose two.)

​ A. Run gcloud auth login, enter your login credentials in the dialog window,
and paste the received login token to gcloud CLI. Most Voted
​ B. Create a Google Cloud service account, and download the service account
key. Place the key file in a folder on your machine where gcloud CLI can find it.
​ C. Download your Cloud Identity user account key. Place the key file in a folder
on your machine where gcloud CLI can find it.
​ D. Run gcloud config set compute/zone $my_zone to set the default zone for
gcloud CLI.
​ E. Run gcloud config set project $my_project to set the default project for gcloud
CLI.
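The two voted steps on a fresh gcloud installation look like this; the project ID is a placeholder:

```shell
# Authenticate the CLI with your user account (opens a browser dialog).
gcloud auth login

# Set the default project so commands know where to look.
gcloud config set project my-project

# Now the listing works.
gcloud compute instances list
```

No default zone is required, because `instances list` returns instances from all zones in the project.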

Q264. The core business of your company is to rent out construction equipment at large
scale. All the equipment that is being rented out has been equipped with multiple
sensors that send event information every few seconds. These signals can vary from
engine status, distance traveled, fuel level, and more. Customers are billed based on
the consumption monitored by these sensors. You expect high throughput – up to
thousands of events per hour per device – and need to retrieve consistent data based
on the time of the event. Storing and retrieving individual signals should be atomic.
What should you do?

​ A. Create files in Cloud Storage as data comes in.
​ B. Create a file in Filestore per device, and append new data to that file.
​ C. Ingest the data into Cloud SQL. Use multiple read replicas to match the
throughput.
​ D. Ingest the data into Bigtable. Create a row key based on the event
timestamp. Most Voted

Q263. Your customer wants you to create a secure website with autoscaling based on
the compute instance CPU load. You want to enhance performance by storing static
content in Cloud Storage. Which resources are needed to distribute the user traffic?

​ A. An external HTTP(S) load balancer with a managed SSL certificate to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend. Most Voted
​ B. An external network load balancer pointing to the backend instances to
distribute the load evenly. The web servers will forward the request to the Cloud
Storage as needed.
​ C. An internal HTTP(S) load balancer together with Identity-Aware Proxy to allow
only HTTPS traffic.
​ D. An external HTTP(S) load balancer to distribute the load and a URL map to
target the requests for the static content to the Cloud Storage backend. Install the
HTTPS certificates on the instance.

Q262. You need to extract text from audio files by using the Speech-to-Text API. The
audio files are pushed to a Cloud Storage bucket. You need to implement a fully
managed, serverless compute solution that requires authentication and aligns with
Google-recommended practices. You want to automate the call to the API by submitting
each file to the API as the audio file arrives in the bucket. What should you do?

​ A. Create an App Engine standard environment triggered by Cloud Storage bucket events to submit the file URI to the Google Speech-to-Text API.
​ B. Run a Kubernetes job to scan the bucket regularly for incoming files, and call
the Speech-to-Text API for each unprocessed file.
​ C. Run a Python script by using a Linux cron job in Compute Engine to scan the
bucket regularly for incoming files, and call the Speech-to-Text API for each
unprocessed file.
​ D. Create a Cloud Function triggered by Cloud Storage bucket events to
submit the file URI to the Google Speech-to-Text API. Most Voted
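A deployment sketch for the voted answer; the function name, region, runtime, bucket, and entry point are placeholders:

```shell
# Deploy a function that fires on each new object in the bucket.
# The function body (entry point "transcribe") would call the
# Speech-to-Text API with the uploaded file's URI.
gcloud functions deploy transcribe-audio \
    --region=us-central1 \
    --runtime=python311 \
    --trigger-bucket=my-audio-bucket \
    --entry-point=transcribe
```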

Q261. You are in charge of provisioning access for all Google Cloud users in your
organization. Your company recently acquired a startup company that has their own
Google Cloud organization. You need to ensure that your Site Reliability Engineers
(SREs) have the same project permissions in the startup company's organization as in
your own organization. What should you do?

​ A. In the Google Cloud console for your organization, select Create role from
selection, and choose destination as the startup company's organization.
​ B. In the Google Cloud console for the startup company, select Create role from
selection and choose source as the startup company's Google Cloud
organization.
​ C. Use the gcloud iam roles copy command, and provide the Organization
ID of the startup company's Google Cloud Organization as the destination.
Most Voted

​ D. Use the gcloud iam roles copy command, and provide the project IDs of all
projects in the startup company's organization as the destination.

Q260. You are building a backend service for an ecommerce platform that will persist
transaction data from mobile and web clients. After the platform is launched, you expect
a large volume of global transactions. Your business team wants to run SQL queries to
analyze the data. You need to build a highly available and scalable data store for the
platform. What should you do?

​ A. Create a multi-region Cloud Spanner instance with an optimized schema. Most Voted
​ B. Create a multi-region Firestore database with aggregation query enabled.
​ C. Create a multi-region Cloud SQL for PostgreSQL database with optimized
indexes.
​ D. Create a multi-region BigQuery dataset with optimized tables.

Q259. Your Dataproc cluster runs in a single Virtual Private Cloud (VPC) network in a
single subnetwork with range 172.16.20.128/25. There are no private IP addresses
available in the subnetwork. You want to add new VMs to communicate with your cluster
using the minimum number of steps. What should you do?

​ A. Modify the existing subnet range to 172.16.20.0/24. Most Voted
​ B. Create a new Secondary IP Range in the VPC and configure the VMs to use
that range.
​ C. Create a new VPC network for the VMs. Enable VPC Peering between the VMs' VPC network and the Dataproc cluster VPC network.
​ D. Create a new VPC network for the VMs with a subnet of 172.32.0.0/16.
Enable VPC network Peering between the Dataproc VPC network and the VMs
VPC network. Configure a custom Route exchange.

Q258. You are configuring service accounts for an application that spans multiple
projects. Virtual machines (VMs) running in the web-applications project need access to
BigQuery datasets in the crm-databases project. You want to follow
Google-recommended practices to grant access to the service account in the
web-applications project. What should you do?

​ A. Grant "project owner" for web-applications appropriate roles to crm-databases.
​ B. Grant "project owner" role to crm-databases and the web-applications project.
​ C. Grant "project owner" role to crm-databases and roles/bigquery.dataViewer
role to web-applications.
​ D. Grant roles/bigquery.dataViewer role to crm-databases and appropriate
roles to web-applications. Most Voted
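The voted answer's cross-project grant can be sketched like this; the project IDs and service account email are placeholders:

```shell
# In crm-databases, grant the web-applications service account
# read access to BigQuery data.
gcloud projects add-iam-policy-binding crm-databases \
    --member=serviceAccount:web-app-sa@web-applications.iam.gserviceaccount.com \
    --role=roles/bigquery.dataViewer
```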

Q257. After a recent security incident, your startup company wants better insight into
what is happening in the Google Cloud environment. You need to monitor unexpected
firewall changes and instance creation. Your company prefers simple solutions. What
should you do?

​ A. Create a log sink to forward Cloud Audit Logs filtered for firewalls and
compute instances to Cloud Storage. Use BigQuery to periodically analyze log
events in the storage bucket.
​ B. Use Cloud Logging filters to create log-based metrics for firewall and
instance actions. Monitor the changes and set up reasonable alerts. Most
Voted

​ C. Install Kibana on a compute instance. Create a log sink to forward Cloud Audit
Logs filtered for firewalls and compute instances to Pub/Sub. Target the Pub/Sub
topic to push messages to the Kibana instance. Analyze the logs on Kibana in
real time.
​ D. Turn on Google Cloud firewall rules logging, and set up alerts for any insert,
update, or delete events.
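A log-based metric for the voted answer could be created like this; the metric name is a placeholder and the filter is illustrative (audit-log method names vary by API version):

```shell
# Count firewall-rule changes from Cloud Audit Logs; alerting policies
# can then be attached to this metric in Cloud Monitoring.
gcloud logging metrics create firewall-changes \
    --description="Firewall rule insert/patch/delete events" \
    --log-filter='resource.type="gce_firewall_rule" AND protoPayload.methodName:("insert" OR "patch" OR "delete")'
```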

Q256. You have deployed an application on a Compute Engine instance. An external consultant needs to access the Linux-based instance. The consultant is connected to your corporate network through a VPN connection, but the consultant has no Google account. What should you do?

​ A. Instruct the external consultant to use the gcloud compute ssh command line
tool by using Identity-Aware Proxy to access the instance.
​ B. Instruct the external consultant to use the gcloud compute ssh command line
tool by using the public IP address of the instance to access it.
​ C. Instruct the external consultant to generate an SSH key pair, and request
the public key from the consultant. Add the public key to the instance
yourself, and have the consultant access the instance through SSH with
their private key. Most Voted
​ D. Instruct the external consultant to generate an SSH key pair, and request the
private key from the consultant. Add the private key to the instance yourself, and
have the consultant access the instance through SSH with their public key.

Q255. You want to enable your development team to deploy new features to an existing
Cloud Run service in production. To minimize the risk associated with a new revision,
you want to reduce the number of customers who might be affected by an outage
without introducing any development or operational costs to your customers. You want
to follow Google-recommended practices for managing revisions to a service. What
should you do?

​ A. Ask your customers to retry access to your service with exponential backoff to
mitigate any potential problems after the new revision is deployed.
​ B. Gradually roll out the new revision and split customer traffic between the
revisions to allow rollback in case a problem occurs. Most Voted
​ C. Send all customer traffic to the new revision, and roll back to a previous
revision if you witness any problems in production.
​ D. Deploy your application to a second Cloud Run service, and ask your
customers to use the second Cloud Run service.

Q254. Your company's security vulnerability management policy wants a member of the
security team to have visibility into vulnerabilities and other OS metadata for a specific
Compute Engine instance. This Compute Engine instance hosts a critical application in
your Google Cloud project. You need to implement your company's security vulnerability
management policy. What should you do?

​ A. • Ensure that the Ops Agent is installed on the Compute Engine instance.
• Create a custom metric in the Cloud Monitoring dashboard.
• Provide the security team member with access to this dashboard.
​ B. • Ensure that the Ops Agent is installed on the Compute Engine instance.
• Provide the security team member roles/osconfig.inventoryViewer permission.
​ C. • Ensure that the OS Config agent is installed on the Compute Engine
instance.
• Provide the security team member
roles/osconfig.vulnerabilityReportViewer permission. Most Voted
​ D. • Ensure that the OS Config agent is installed on the Compute Engine
instance.
• Create a log sink to BigQuery dataset.
• Provide the security team member with access to this dataset.

Q253. You need to deploy an application in Google Cloud using serverless technology.
You want to test a new version of the application with a small percentage of production
traffic. What should you do?

​ A. Deploy the application to Cloud Run. Use gradual rollouts for traffic
splitting. Most Voted
​ B. Deploy the application to Google Kubernetes Engine. Use Anthos Service Mesh for traffic splitting.
​ C. Deploy the application to Cloud Functions. Specify the version number in the
functions name.
​ D. Deploy the application to App Engine. For each new version, create a new
service.
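A gradual rollout for the voted answer can be sketched as follows; the service name, image, region, and revision name are placeholders:

```shell
# Deploy the new version without routing any traffic to it yet.
gcloud run deploy my-service \
    --image=gcr.io/my-project/app:v2 \
    --region=us-central1 \
    --no-traffic

# Shift 10% of traffic to the new revision; the rest stays on the old one.
gcloud run services update-traffic my-service \
    --region=us-central1 \
    --to-revisions=my-service-00002-abc=10
```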

Q252. Your company developed an application to deploy on Google Kubernetes Engine. Certain parts of the application are not fault-tolerant and are allowed to have downtime.
Other parts of the application are critical and must always be available. You need to
configure a Google Kubernetes Engine cluster while optimizing for cost. What should
you do?

​ A. Create a cluster with a single node-pool by using standard VMs. Label the fault-tolerant Deployments as spot_true.
​ B. Create a cluster with a single node-pool by using Spot VMs. Label the critical
Deployments as spot_false.
​ C. Create a cluster with both a Spot VM node pool and a node pool by using
standard VMs. Deploy the critical deployments on the Spot VM node pool and
the fault-tolerant deployments on the node pool by using standard VMs.
​ D. Create a cluster with both a Spot VM node pool and a node pool by
using standard VMs. Deploy the critical deployments on the node pool by
using standard VMs and the fault-tolerant deployments on the Spot VM
node pool. Most Voted
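
A sketch of the mixed-pool setup from option D, with placeholder cluster, pool, and Deployment names. GKE automatically labels Spot VM nodes with `cloud.google.com/gke-spot=true`, which fault-tolerant workloads can select:

```shell
# Add a Spot VM node pool alongside the default standard-VM pool.
gcloud container node-pools create spot-pool --cluster=my-cluster \
  --region=us-central1 --spot --num-nodes=3

# Steer a fault-tolerant Deployment onto the Spot nodes via nodeSelector;
# critical Deployments are left unmodified and stay on standard VMs.
kubectl patch deployment batch-worker --type=merge -p \
  '{"spec":{"template":{"spec":{"nodeSelector":{"cloud.google.com/gke-spot":"true"}}}}}'
```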
Q251. You installed the Google Cloud CLI on your workstation and set the proxy
configuration. However, you are worried that your proxy credentials will be recorded in
the gcloud CLI logs. You want to prevent your proxy credential from being logged. What
should you do?

​ A. Configure username and password by using gcloud config set
proxy/username and gcloud config set proxy/password commands.
​ B. Encode username and password in sha256 encoding, and save it to a text
file. Use the filename as a value in the gcloud config set
core/custom_ca_certs_file command.
​ C. Provide values for CLOUDSDK_PROXY_USERNAME and
CLOUDSDK_PROXY_PASSWORD in the gcloud CLI tool configuration file.
​ D. Set the CLOUDSDK_PROXY_USERNAME and
CLOUDSDK_PROXY_PASSWORD properties by using environment
variables in your command line tool.
Most Voted

Q250. You need to migrate invoice documents stored on-premises to Cloud Storage.
The documents have the following storage requirements:

• Documents must be kept for five years.
• Up to five revisions of the same invoice document must be stored, to allow for
corrections.
• Documents older than 365 days should be moved to lower cost storage tiers.

You want to follow Google-recommended practices to minimize your operational and
development costs. What should you do?

​ A. Enable retention policies on the bucket, and use Cloud Scheduler to invoke a
Cloud Function to move or delete your documents based on their metadata.
​ B. Enable retention policies on the bucket, use lifecycle rules to change the
storage classes of the objects, set the number of versions, and delete old files.
​ C. Enable object versioning on the bucket, and use Cloud Scheduler to invoke a
Cloud Functions instance to move or delete your documents based on their
metadata.
​ D. Enable object versioning on the bucket, use lifecycle conditions to
change the storage class of the objects, set the number of versions, and
delete old files. Most Voted
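
The versioning-plus-lifecycle approach in option D can be expressed as a lifecycle policy file. A sketch with a placeholder bucket name; the storage class and age values follow the requirements above:

```shell
# Lifecycle policy: after 365 days move objects to a colder tier; keep at
# most five noncurrent versions; delete anything older than five years.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
     "condition": {"age": 365}},
    {"action": {"type": "Delete"},
     "condition": {"numNewerVersions": 5}},
    {"action": {"type": "Delete"},
     "condition": {"age": 1825}}
  ]
}
EOF

# Apply to the bucket (requires an authenticated project):
# gcloud storage buckets update gs://invoice-archive --versioning
# gcloud storage buckets update gs://invoice-archive --lifecycle-file=lifecycle.json
```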

Q249. Your manager asks you to deploy a workload to a Kubernetes cluster. You are not
sure of the workload's resource requirements or how the requirements might vary
depending on usage patterns, external dependencies, or other factors. You need a
solution that makes cost-effective recommendations regarding CPU and memory
requirements, and allows the workload to function consistently in any situation. You
want to follow Google-recommended practices. What should you do?

​ A. Configure the Horizontal Pod Autoscaler for availability, and configure the
cluster autoscaler for suggestions.
​ B. Configure the Horizontal Pod Autoscaler for availability, and configure
the Vertical Pod Autoscaler recommendations for suggestions. Most Voted
​ C. Configure the Vertical Pod Autoscaler recommendations for availability, and
configure the Cluster autoscaler for suggestions.
​ D. Configure the Vertical Pod Autoscaler recommendations for availability, and
configure the Horizontal Pod Autoscaler for suggestions.

Q248. Your application stores files on Cloud Storage by using the Standard Storage
class. The application only requires access to files created in the last 30 days. You want
to automatically save costs on files that are no longer accessed by the application. What
should you do?

​ A. Create an object lifecycle on the storage bucket to change the storage class to
Archive Storage for objects with an age over 30 days.
​ B. Create a cron job in Cloud Scheduler to call a Cloud Functions instance
every day to delete files older than 30 days.
​ C. Create a retention policy on the storage bucket of 30 days, and lock the
bucket by using a retention policy lock.
​ D. Enable object versioning on the storage bucket and add lifecycle rules to
expire non-current versions after 30 days.
Q247. Your company is moving its continuous integration and delivery (CI/CD) pipeline
to Compute Engine instances. The pipeline will manage the entire cloud infrastructure
through code. How can you ensure that the pipeline has appropriate permissions while
your system is following security best practices?

​ A. • Attach a single service account to the compute instances.
• Add minimal rights to the service account.
• Allow the service account to impersonate a Cloud Identity user with elevated
permissions to create, update, or delete resources.
​ B. • Add a step for human approval to the CI/CD pipeline before the execution of
the infrastructure provisioning.
• Use the human approvals IAM account for the provisioning.
​ C. • Attach a single service account to the compute instances.
• Add all required Identity and Access Management (IAM) permissions to this
service account to create, update, or delete resources.
​ D. • Create multiple service accounts, one for each pipeline with the
appropriate minimal Identity and Access Management (IAM) permissions.
• Use a secret manager service to store the key files of the service
accounts.
• Allow the CI/CD pipeline to request the appropriate secrets during the
execution of the pipeline. Most Voted

Q246. Your company wants to migrate their on-premises workloads to Google Cloud.
The current on-premises workloads consist of:

• A Flask web application
• A backend API
• A scheduled long-running background job for ETL and reporting

You need to keep operational costs low. You want to follow Google-recommended
practices to migrate these workloads to serverless solutions on Google Cloud. What
should you do?

​ A. Migrate the web application to App Engine and the backend API to Cloud Run.
Use Cloud Tasks to run your background job on Compute Engine.
​ B. Migrate the web application to App Engine and the backend API to Cloud
Run. Use Cloud Tasks to run your background job on Cloud Run. Most Voted
​ C. Run the web application on a Cloud Storage bucket and the backend API on
Cloud Run. Use Cloud Tasks to run your background job on Cloud Run.
​ D. Run the web application on a Cloud Storage bucket and the backend API on
Cloud Run. Use Cloud Tasks to run your background job on Compute Engine.

Q245. You want to set up a Google Kubernetes Engine cluster. Verifiable node identity
and integrity are required for the cluster, and nodes cannot be accessed from the
internet. You want to reduce the operational cost of managing your cluster, and you
want to follow Google-recommended practices. What should you do?

​ A. Deploy a private autopilot cluster.
​ B. Deploy a public autopilot cluster.
​ C. Deploy a standard public cluster and enable shielded nodes.
​ D. Deploy a standard private cluster and enable shielded nodes.
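
A sketch of the private Autopilot option (cluster name and region are placeholders). Autopilot manages the nodes for you, and Shielded GKE nodes, which provide verifiable node identity and integrity, are enabled by default on Autopilot:

```shell
# Private Autopilot cluster: nodes have no public IP addresses.
gcloud container clusters create-auto my-cluster \
  --region=us-central1 \
  --enable-private-nodes
```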

Q244. You have a Bigtable instance that consists of three nodes that store personally
identifiable information (PII) data. You need to log all read or write operations, including
any metadata or configuration reads of this database table, in your company’s Security
Information and Event Management (SIEM) system. What should you do?

​ A. • Navigate to Cloud Monitoring in the Google Cloud console, and create a
custom monitoring job for the Bigtable instance to track all changes.
• Create an alert by using webhook endpoints, with the SIEM endpoint as a
receiver.
​ B. • Navigate to the Audit Logs page in the Google Cloud console, and enable
Admin Write logs for the Bigtable instance.
• Create a Cloud Functions instance to export logs from Cloud Logging to your
SIEM.
​ C. • Navigate to the Audit Logs page in the Google Cloud console, and
enable Data Read, Data Write and Admin Read logs for the Bigtable
instance.
• Create a Pub/Sub topic as a Cloud Logging sink destination, and add your
SIEM as a subscriber to the topic.
Most Voted
​ D. • Install the Ops Agent on the Bigtable instance during configuration.
• Create a service account with read permissions for the Bigtable instance.
• Create a custom Dataflow job with this service account to export logs to the
company’s SIEM system.
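
Once the Data Read, Data Write, and Admin Read audit logs are enabled, the Pub/Sub sink from option C could be created along these lines (project, topic, and sink names are placeholders):

```shell
# Topic the SIEM will subscribe to.
gcloud pubsub topics create siem-export

# Route Bigtable audit log entries into the topic.
gcloud logging sinks create bigtable-audit-sink \
  pubsub.googleapis.com/projects/my-project/topics/siem-export \
  --log-filter='resource.type="bigtable_instance" AND logName:"cloudaudit.googleapis.com"'
```

The sink creates a service account whose identity must be granted publish rights on the topic before entries flow.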

Q243. Your company is running a three-tier web application on virtual machines that use
a MySQL database. You need to create an estimated total cost of cloud infrastructure to
run this application on Google Cloud instances and Cloud SQL. What should you do?

​ A. Create a Google spreadsheet with multiple Google Cloud resource
combinations. On a separate sheet, import the current Google Cloud prices and
use these prices for the calculations within formulas.
​ B. Use the Google Cloud Pricing Calculator and select the Cloud Operations
template to define your web application with as much detail as possible.
​ C. Implement a similar architecture on Google Cloud, and run a reasonable load
test on a smaller scale. Check the billing information, and calculate the estimated
costs based on the real load your system usually handles.
​ D. Use the Google Cloud Pricing Calculator to determine the cost of every
Google Cloud resource you expect to use. Use similar size instances for
the web server, and use your current on-premises machines as a
comparison for Cloud SQL. Most Voted

Q242. You recently discovered that your developers are using many service account
keys during their development process. While you work on a long term improvement,
you need to quickly implement a process to enforce short-lived service account
credentials in your company. You have the following requirements:

• All service accounts that require a key should be created in a centralized project called
pj-sa.
• Service account keys should only be valid for one day.

You need a Google-recommended solution that minimizes cost. What should you do?
​ A. Implement a Cloud Run job to rotate all service account keys periodically in
pj-sa. Enforce an org policy to deny service account key creation with an
exception to pj-sa.
​ B. Implement a Kubernetes CronJob to rotate all service account keys
periodically. Disable attachment of service accounts to resources in all projects
with an exception to pj-sa.
​ C. Enforce an org policy constraint allowing the lifetime of service account
keys to be 24 hours. Enforce an org policy constraint denying service
account key creation with an exception on pj-sa. Most Voted
​ D. Enforce a DENY org policy constraint over the lifetime of service account keys
for 24 hours. Disable attachment of service accounts to resources in all projects
with an exception to pj-sa.
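
A sketch of the two org-policy constraints from option C. The constraint names are taken from the current organization-policy documentation (iam.serviceAccountKeyExpiryHours and iam.disableServiceAccountKeyCreation); ORG_ID is a placeholder:

```shell
# Limit new service account keys to a 24-hour lifetime, org-wide.
gcloud resource-manager org-policies allow \
  iam.serviceAccountKeyExpiryHours 24h --organization=ORG_ID

# Deny key creation everywhere...
gcloud resource-manager org-policies enable-enforce \
  iam.disableServiceAccountKeyCreation --organization=ORG_ID

# ...except in the centralized pj-sa project.
gcloud resource-manager org-policies disable-enforce \
  iam.disableServiceAccountKeyCreation --project=pj-sa
```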

Q241. You used the gcloud container clusters command to create two Google Cloud
Kubernetes (GKE) clusters: prod-cluster and dev-cluster.

• prod-cluster is a standard cluster.


• dev-cluster is an auto-pilot cluster.

When you run the kubectl get nodes command, you only see the nodes from
prod-cluster. Which commands should you run to check the node status for dev-cluster?

​ A. gcloud container clusters get-credentials dev-cluster
kubectl get nodes
Most Voted

​ B. gcloud container clusters update --generate-password dev-cluster
kubectl get nodes
​ C. kubectl config set-context dev-cluster
kubectl cluster-info
​ D. kubectl config set-credentials dev-cluster
kubectl cluster-info
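
The get-credentials step writes a kubeconfig entry for the cluster and makes it the current kubectl context, so a subsequent kubectl call targets dev-cluster (location flag is a placeholder):

```shell
# Fetch credentials for the Autopilot cluster and switch kubectl to it.
gcloud container clusters get-credentials dev-cluster --region=us-central1

# Now lists the dev-cluster nodes instead of prod-cluster's.
kubectl get nodes
```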

Q240. You have an on-premises data analytics set of binaries that processes data files
in memory for about 45 minutes every midnight. The sizes of those data files range from
1 gigabyte to 16 gigabytes. You want to migrate this application to Google Cloud with
minimal effort and cost. What should you do?

​ A. Create a container for the set of binaries. Use Cloud Scheduler to start a
Cloud Run job for the container.

​ B. Create a container for the set of binaries. Deploy the container to Google
Kubernetes Engine (GKE) and use the Kubernetes scheduler to start the
application.

​ C. Upload the code to Cloud Functions. Use Cloud Scheduler to start the
application.

​ D. Lift and shift to a VM on Compute Engine. Use an instance schedule to
start and stop the instance.

Q239. You are responsible for a web application on Compute Engine. You want your
support team to be notified automatically if users experience high latency for at least 5
minutes. You need a Google-recommended solution with no development cost. What
should you do?

​ A. Export Cloud Monitoring metrics to BigQuery and use a Looker Studio
dashboard to monitor your web application’s latency.

​ B. Create an alert policy to send a notification when the HTTP response
latency exceeds the specified threshold. Most Voted

​ C. Implement an App Engine service which invokes the Cloud Monitoring API
and sends a notification in case of anomalies.

​ D. Use the Cloud Monitoring dashboard to observe latency and take the
necessary actions when the response latency exceeds the specified threshold.
Q238. You are working in a team that has developed a new application that needs to be
deployed on Kubernetes. The production application is business critical and should be
optimized for reliability. You need to provision a Kubernetes cluster and want to follow
Google-recommended practices. What should you do?

​ A. Create a GKE Autopilot cluster. Enroll the cluster in the rapid release channel.

​ B. Create a GKE Autopilot cluster. Enroll the cluster in the stable release
channel. Most Voted

​ C. Create a zonal GKE standard cluster. Enroll the cluster in the stable release
channel.

​ D. Create a regional GKE standard cluster. Enroll the cluster in the rapid release
channel.

Q237. You are running multiple microservices in a Kubernetes Engine cluster. One
microservice is rendering images. The microservice responsible for the image rendering
requires a large amount of CPU time compared to the memory it requires. The other
microservices are workloads that are optimized for n2-standard machine types. You
need to optimize your cluster so that all workloads are using resources as efficiently as
possible. What should you do?

​ A. Assign the pods of the image rendering microservice a higher pod priority than
the other microservices.

​ B. Create a node pool with compute-optimized machine type nodes for the
image rendering microservice. Use the node pool with general-purpose
machine type nodes for the other microservices. Most Voted

​ C. Use the node pool with general-purpose machine type nodes for the image
rendering microservice. Create a node pool with compute-optimized machine
type nodes for the other microservices.
​ D. Configure the required amount of CPU and memory in the resource requests
specification of the image rendering microservice deployment. Keep the resource
requests for the other microservices at the default.
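
The dedicated-pool approach in option B could look like the following sketch (cluster, pool, and Deployment names are placeholders). GKE labels every node with its pool name, which the renderer can select:

```shell
# Compute-optimized pool for the CPU-heavy rendering workload.
gcloud container node-pools create render-pool --cluster=my-cluster \
  --region=us-central1 --machine-type=c2-standard-8 --num-nodes=2

# Pin the image-rendering Deployment to that pool via nodeSelector.
kubectl patch deployment image-renderer --type=merge -p \
  '{"spec":{"template":{"spec":{"nodeSelector":{"cloud.google.com/gke-nodepool":"render-pool"}}}}}'
```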

Q236. Your team is running an on-premises ecommerce application. The application
contains a complex set of microservices written in Python, and each microservice is
running on Docker containers. Configurations are injected by using environment
variables. You need to deploy your current application to a serverless Google Cloud
cloud solution. What should you do?

​ A. Use your existing CI/CD pipeline. Use the generated Docker images and
deploy them to Cloud Run. Update the configurations and the required
endpoints. Most Voted

​ B. Use your existing continuous integration and delivery (CI/CD) pipeline. Use
the generated Docker images and deploy them to Cloud Function. Use the same
configuration as on-premises.

​ C. Use the existing codebase and deploy each service as a separate Cloud
Function. Update the configurations and the required endpoints.

​ D. Use your existing codebase and deploy each service as a separate Cloud
Run. Use the same configurations as on-premises.

Q235. The DevOps group in your organization needs full control of Compute Engine
resources in your development project. However, they should not have permission to
create or update any other resources in the project. You want to follow Google’s
recommendations for setting permissions for the DevOps group. What should you do?

​ A. Grant the basic role roles/viewer and the predefined role
roles/compute.admin to the DevOps group. Most Voted
​ B. Create an IAM policy and grant all compute.instanceAdmin.* permissions to
the policy. Attach the policy to the DevOps group.

​ C. Create a custom role at the folder level and grant all compute.instanceAdmin.*
permissions to the role.

​ D. Grant the custom role to the DevOps group.

Q234. You are migrating a business critical application from your local data center into
Google Cloud. As part of your high-availability strategy, you want to ensure that any
data used by the application will be immediately available if a zonal failure occurs. What
should you do?

​ A. Store the application data on a zonal persistent disk. Create a snapshot
schedule for the disk. If an outage occurs, create a new disk from the most
recent snapshot and attach it to a new VM in another zone.

​ B. Store the application data on a zonal persistent disk. If an outage occurs,
create an instance in another zone with this disk attached.

​ C. Store the application data on a regional persistent disk. Create a snapshot
schedule for the disk. If an outage occurs, create a new disk from the most
recent snapshot and attach it to a new VM in another zone.

​ D. Store the application data on a regional persistent disk. If an outage
occurs, create an instance in another zone with this disk attached.
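
A sketch of the regional-disk approach (disk, VM, and zone names are placeholders). A regional persistent disk replicates synchronously across two zones, so the data survives a zonal failure without restoring from snapshots:

```shell
# Regional persistent disk replicated across two zones in the region.
gcloud compute disks create app-data --size=200GB --type=pd-ssd \
  --region=us-central1 --replica-zones=us-central1-a,us-central1-b

# After a zonal outage, attach the surviving replica to a VM in the other zone.
gcloud compute instances attach-disk app-vm-b --zone=us-central1-b \
  --disk=app-data --disk-scope=regional
```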

Q233. Your application development team has created Docker images for an application
that will be deployed on Google Cloud. Your team does not want to manage the
infrastructure associated with this application. You need to ensure that the application
can scale automatically as it gains popularity. What should you do?

​ A. Create an instance template with the container image, and deploy a Managed
Instance Group with Autoscaling.
​ B. Upload Docker images to Artifact Registry, and deploy the application on
Google Kubernetes Engine using Standard mode.

​ C. Upload Docker images to the Cloud Storage, and deploy the application on
Google Kubernetes Engine using Standard mode.

​ D. Upload Docker images to Artifact Registry, and deploy the application on
Cloud Run. Most Voted
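
A sketch of option D with placeholder project, repository, and service names. Cloud Run scales instances automatically with traffic (including to zero) and leaves no infrastructure to manage:

```shell
# Push the built image to Artifact Registry.
docker push us-docker.pkg.dev/my-project/my-repo/my-app:v1

# Deploy it as an autoscaling Cloud Run service.
gcloud run deploy my-app \
  --image=us-docker.pkg.dev/my-project/my-repo/my-app:v1 \
  --region=us-central1 --allow-unauthenticated
```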

Q232. You recently received a new Google Cloud project with an attached billing
account where you will work. You need to create instances, set firewalls, and store data
in Cloud Storage. You want to follow Google-recommended practices. What should you
do?

​ A. Use the gcloud CLI services enable cloudresourcemanager.googleapis.com
command to enable all resources.

​ B. Use the gcloud services enable compute.googleapis.com command to
enable Compute Engine and the gcloud services enable
storage-api.googleapis.com command to enable the Cloud Storage APIs.
Most Voted

​ C. Open the Google Cloud console and enable all Google Cloud APIs from the
API dashboard.

​ D. Open the Google Cloud console and run gcloud init --project in a Cloud Shell.

Q231. You have deployed an application on a single Compute Engine instance. The
application writes logs to disk. Users start reporting errors with the application. You want
to diagnose the problem. What should you do?

​ A. Navigate to Cloud Logging and view the application logs.

​ B. Configure a health check on the instance and set a “consecutive successes”
Healthy threshold value of 1.
​ C. Connect to the instance’s serial console and read the application logs.

​ D. Install and configure the Ops agent and view the logs from Cloud
Logging. Most Voted

Q230. Your company has a large quantity of unstructured data in different file formats.
You want to perform ETL transformations on the data. You need to make the data
accessible on Google Cloud so it can be processed by a Dataflow job. What should you
do?

​ A. Upload the data to BigQuery using the bq command line tool.

​ B. Upload the data to Cloud Storage using the gcloud storage command.
Most Voted

​ C. Upload the data into Cloud SQL using the import function in the Google Cloud
console.

​ D. Upload the data into Cloud Spanner using the import function in the Google
Cloud console.

Q229. You created several resources in multiple Google Cloud projects. All projects are
linked to different billing accounts. To better estimate future charges, you want to have a
single visual representation of all costs incurred. You want to include new cost data as
soon as possible. What should you do?

​ A. Fill all resources in the Pricing Calculator to get an estimate of the monthly
cost.

​ B. Use the Reports view in the Cloud Billing Console to view the desired cost
information.

​ C. Visit the Cost Table page to get a CSV export and visualize it using Looker
Studio.
​ D. Configure Billing Data Export to BigQuery and visualize the data in
Looker Studio. Most Voted

Q228. You have a batch workload that runs every night and uses a large number of
virtual machines (VMs). It is fault-tolerant and can tolerate some of the VMs being
terminated. The current cost of VMs is too high. What should you do?

​ A. Run a test using simulated maintenance events. If the test is successful,
use Spot N2 Standard VMs when running future jobs. Most Voted

​ B. Run a test using simulated maintenance events. If the test is successful, use
N2 Standard VMs when running future jobs.

​ C. Run a test using a managed instance group. If the test is successful, use N2
Standard VMs in the managed instance group when running future jobs.

​ D. Run a test using N1 standard VMs instead of N2. If the test is successful, use
N1 Standard VMs when running future jobs.

Q227. You are using Looker Studio to visualize a table from your data warehouse that is
built on top of BigQuery. Data is appended to the data warehouse during the day. At
night, the daily summary is recalculated by overwriting the table. You just noticed that
the charts in Looker Studio are broken, and you want to analyze the problem. What
should you do?

​ A. In Cloud Logging, create a filter for your Looker Studio report.

​ B. Use the open source CLI tool, Snapshot Debugger, to find out why the data
was not refreshed correctly.

​ C. Review the Error Reporting page in the Google Cloud console to find any
errors.
​ D. Use the BigQuery interface to review the nightly job and look for any
errors. Most Voted

Q226. You have created an application that is packaged into a Docker image. You want
to deploy the Docker image as a workload on Google Kubernetes Engine. What should
you do?

​ A. Upload the image to Cloud Storage and create a Kubernetes Service
referencing the image.
​ B. Upload the image to Cloud Storage and create a Kubernetes Deployment
referencing the image.
​ C. Upload the image to Artifact Registry and create a Kubernetes Service
referencing the image.
​ D. Upload the image to Artifact Registry and create a Kubernetes
Deployment referencing the image. Most Voted

Q225. Your managed instance group raised an alert stating that new instance creation
has failed to create new instances. You need to solve the instance creation problem.
What should you do?

​ A. Create an instance template that contains valid syntax which will be
used by the instance group. Delete any persistent disks with the same
name as instance names. Most Voted

​ B. Create an instance template that contains valid syntax that will be used by the
instance group. Verify that the instance name and persistent disk name values
are not the same in the template.

​ C. Verify that the instance template being used by the instance group contains
valid syntax. Delete any persistent disks with the same name as instance names.
Set the disks.autoDelete property to true in the instance template.
​ D. Delete the current instance template and replace it with a new instance
template. Verify that the instance name and persistent disk name values are not
the same in the template. Set the disks.autoDelete property to true in the
instance template.

Q224. You want to host your video encoding software on Compute Engine. Your user
base is growing rapidly, and users need to be able to encode their videos at any time
without interruption or CPU limitations. You must ensure that your encoding solution is
highly available, and you want to follow Google-recommended practices to automate
operations. What should you do?

​ A. Deploy your solution on multiple standalone Compute Engine instances, and
increase the number of existing instances when CPU utilization on Cloud
Monitoring reaches a certain threshold.
​ B. Deploy your solution on multiple standalone Compute Engine instances, and
replace existing instances with high-CPU instances when CPU utilization on
Cloud Monitoring reaches a certain threshold.
​ C. Deploy your solution to an instance group, and increase the number of
available instances whenever you see high CPU utilization in Cloud Monitoring.
​ D. Deploy your solution to an instance group, and set the autoscaling
based on CPU utilization. Most Voted

Q223. Your company is using Google Workspace to manage employee accounts.
Anticipated growth will increase the number of personnel from 100 employees to 1,000
employees within 2 years. Most employees will need access to your company’s Google
Cloud account. The systems and processes will need to support 10x growth without
performance degradation, unnecessary complexity, or security issues. What should you
do?
​ A. Migrate the users to Active Directory. Connect the Human Resources system
to Active Directory. Turn on Google Cloud Directory Sync (GCDS) for Cloud
Identity. Turn on Identity Federation from Cloud Identity to Active Directory.

​ B. Organize the users in Cloud Identity into groups. Enforce multi-factor
authentication in Cloud Identity. Most Voted
​ C. Turn on identity federation between Cloud Identity and Google Workspace.
Enforce multi-factor authentication for domain wide delegation.
​ D. Use a third-party identity provider service through federation. Synchronize the
users from Google Workspace to the third-party provider in real time.

Q192. You have created an application that is packaged into a Docker image. You want
to deploy the Docker image as a workload on Google Kubernetes Engine. What should
you do?

​ A. Upload the image to Cloud Storage and create a Kubernetes Service
referencing the image.
​ B. Upload the image to Cloud Storage and create a Kubernetes Deployment
referencing the image.
​ C. Upload the image to Container Registry and create a Kubernetes Service
referencing the image.
​ D. Upload the image to Container Registry and create a Kubernetes
Deployment referencing the image. Most Voted

Q216. You are working for a hospital that stores its medical images in an on-premises
data room. The hospital wants to use Cloud Storage for archival storage of these
images. The hospital wants an automated process to upload any new medical images
to Cloud Storage. You need to design and implement a solution. What should you do?
​ A. Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub
topic. Create an application that sends all medical images to the Pub/Sub topic.

​ B. Create a script that uses the gcloud storage command to synchronize
the on-premises storage with Cloud Storage. Schedule the script as a cron
job. Most Voted

​ C. Create a Pub/Sub topic, and create a Cloud Function connected to the topic
that writes data to Cloud Storage. Create an application that sends all medical
images to the Pub/Sub topic.

​ D. In the Google Cloud console, go to Cloud Storage. Upload the relevant
images to the appropriate bucket.
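
The sync script from option B could be as small as a single command run nightly from cron (local path and bucket name are placeholders):

```shell
# /etc/cron.d entry (placeholder schedule): 0 2 * * * /usr/local/bin/sync-images.sh
# One-way sync of the on-premises image directory to Cloud Storage;
# only new or changed files are transferred.
gcloud storage rsync /data/medical-images gs://hospital-image-archive \
  --recursive
```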

Q217. Your company has an internal application for managing transactional orders. The
application is used exclusively by employees in a single physical location. The
application requires strong consistency, fast queries, and ACID guarantees for
multi-table transactional updates. The first version of the application is implemented in
PostgreSQL, and you want to deploy it to the cloud with minimal code changes. Which
database is most appropriate for this application?

​ A. Bigtable

​ B. BigQuery

​ C. Cloud SQL Most Voted

​ D. Firestore

Q1.) Every employee of your company has a Google account. Your operational team
needs to manage a large number of instances on Compute Engine. Each member of
this team needs only administrative access to the servers. Your security team wants to
ensure that the deployment of credentials is operationally efficient and must be able to
determine who accessed a given instance. What should you do?

​ A. Generate a new SSH key pair. Give the private key to each member of your
team. Configure the public key in the metadata of each instance.
​ B. Ask each member of the team to generate a new SSH key pair and to send
you their public key. Use a configuration management tool to deploy those keys
on each instance.
​ C. Ask each member of the team to generate a new SSH key pair and to add
the public key to their Google account. Grant the
“compute.osAdminLogin” role to the Google group corresponding to this
team. Most Voted
​ D. Generate a new SSH key pair. Give the private key to each member of your
team. Configure the public key as a project-wide public SSH key in your Cloud
Platform project and allow project-wide public SSH keys on each instance.

Q2.) You need to create a custom VPC with a single subnet. The subnet's range must
be as large as possible. Which range should you use?

​ A. 0.0.0.0/0
​ B. 10.0.0.0/8 Most Voted
​ C. 172.16.0.0/12
​ D. 192.168.0.0/16

Q3.) You want to select and configure a cost-effective solution for relational data on
Google Cloud Platform. You are working with a small set of operational data in one
geographic location. You need to support point-in-time recovery. What should you do?

​ A. Select Cloud SQL (MySQL). Verify that the enable binary logging option
is selected. Most Voted
​ B. Select Cloud SQL (MySQL). Select the create failover replicas option.
​ C. Select Cloud Spanner. Set up your instance with 2 nodes.
​ D. Select Cloud Spanner. Set up your instance as multi-regional.
Q4.) You want to configure autohealing for network load balancing for a group of
Compute Engine instances that run in multiple zones, using the fewest possible steps.
You need to configure re-creation of VMs if they are unresponsive after 3 attempts of 10
seconds each. What should you do?

​ A. Create an HTTP load balancer with a backend configuration that references
an existing instance group. Set the health check to healthy (HTTP).
​ B. Create an HTTP load balancer with a backend configuration that references
an existing instance group. Define a balancing mode and set the maximum RPS
to 10.
​ C. Create a managed instance group. Set the Autohealing health check to
healthy (HTTP) Most Voted
​ D. Create a managed instance group. Verify that the autoscaling setting is on.
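
The autohealing setup from option C maps onto two commands. A sketch with placeholder names; the health-check values encode the "3 attempts of 10 seconds each" requirement:

```shell
# Health check: probe every 10 seconds, mark unhealthy after 3 failures.
gcloud compute health-checks create http autoheal-check \
  --check-interval=10s --timeout=10s --unhealthy-threshold=3

# Attach it as the managed instance group's autohealing policy, so
# unresponsive VMs are re-created automatically.
gcloud compute instance-groups managed update my-mig \
  --region=us-central1 \
  --health-check=autoheal-check --initial-delay=300
```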

Q5.) You are using multiple configurations for gcloud. You want to review the configured
Kubernetes Engine cluster of an inactive configuration using the fewest possible steps.
What should you do?

​ A. Use gcloud config configurations describe to review the output.
​ B. Use gcloud config configurations activate and gcloud config list to review the
output.
​ C. Use kubectl config get-contexts to review the output.
​ D. Use kubectl config use-context and kubectl config view to review the
output.
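
Option D avoids switching the active gcloud configuration entirely by working at the kubectl level. A sketch; the context name is a placeholder in GKE's usual `gke_PROJECT_LOCATION_CLUSTER` format:

```shell
# List the kubeconfig contexts available, then switch to the inactive
# cluster's context and inspect only that context's settings.
kubectl config get-contexts
kubectl config use-context gke_my-project_us-central1_dev-cluster
kubectl config view --minify
```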

Q6.) Your company uses Cloud Storage to store application backup files for disaster
recovery purposes. You want to follow Google's recommended practices. Which storage
option should you use?

​ A. Multi-Regional Storage
​ B. Regional Storage
​ C. Nearline Storage
​ D. Coldline Storage
Q7.)Several employees at your company have been creating projects with Cloud
Platform and paying for it with their personal credit cards, which the company
reimburses. The company wants to centralize all these projects under a single, new
billing account. What should you do?

​ A. Contact cloud-billing@google.com with your bank account details and request
a corporate billing account for your company.
​ B. Create a ticket with Google Support and wait for their call to share your credit
card details over the phone.
​ C. In the Google Cloud Platform Console, go to the Resource Manager and move all projects to the root Organization.
​ D. In the Google Cloud Platform Console, create a new billing account and
set up a payment method.

Q8.) You have an application that looks for its licensing server on the IP 10.0.3.21. You
need to deploy the licensing server on Compute Engine. You do not want to change the
configuration of the application and want the application to be able to reach the
licensing server. What should you do?

​ A. Reserve the IP 10.0.3.21 as a static internal IP address using gcloud and assign it to the licensing server. Most Voted
​ B. Reserve the IP 10.0.3.21 as a static public IP address using gcloud and
assign it to the licensing server.
​ C. Use the IP 10.0.3.21 as a custom ephemeral IP address and assign it to the
licensing server.
​ D. Start the licensing server with an automatic ephemeral IP address, and then
promote it to a static internal IP address.

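Answer A can be sketched as follows; the region, zone, and subnet are placeholder assumptions, and the address must lie inside the subnet's range:

```shell
# Reserve 10.0.3.21 as a static internal address in the subnet.
gcloud compute addresses create licensing-server-ip \
    --region us-central1 --subnet default --addresses 10.0.3.21

# Create the licensing server bound to that internal IP, so the
# application's hard-coded address keeps working unchanged.
gcloud compute instances create licensing-server \
    --zone us-central1-a --private-network-ip 10.0.3.21
```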
Q9.) You are deploying an application to App Engine. You want the number of instances
to scale based on request rate. You need at least 3 unoccupied instances at all times.
Which scaling type should you use?
​ A. Manual Scaling with 3 instances.
​ B. Basic Scaling with min_instances set to 3.
​ C. Basic Scaling with max_instances set to 3.
​ D. Automatic Scaling with min_idle_instances set to 3. Most Voted

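In app.yaml terms, answer D corresponds to a fragment like the following (a sketch; only the scaling stanza is shown):

```yaml
# app.yaml (fragment)
automatic_scaling:
  min_idle_instances: 3   # keep at least 3 unoccupied instances warm
```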
Q10.) You have a development project with appropriate IAM roles defined. You are
creating a production project and want to have the same IAM roles on the new project,
using the fewest possible steps. What should you do?

​ A. Use gcloud iam roles copy and specify the production project as the
destination project. Most Voted
​ B. Use gcloud iam roles copy and specify your organization as the destination
organization.
​ C. In the Google Cloud Platform Console, use the 'create role from role'
functionality.
​ D. In the Google Cloud Platform Console, use the 'create role' functionality and
select all applicable permissions.

You need a dynamic way of provisioning VMs on Compute Engine. The exact
specifications will be in a dedicated configuration file. You want to follow Google's
recommended practices. Which method should you use?

​ A. Deployment Manager Most Voted
​ B. Cloud Composer
​ C. Managed Instance Group
​ D. Unmanaged Instance Group

You have a Dockerfile that you need to deploy on Kubernetes Engine. What should you
do?

​ A. Use kubectl app deploy <dockerfilename>.
​ B. Use gcloud app deploy <dockerfilename>.
​ C. Create a docker image from the Dockerfile and upload it to Container
Registry. Create a Deployment YAML file to point to that image. Use kubectl
to create the deployment with that file. Most Voted
​ D. Create a docker image from the Dockerfile and upload it to Cloud Storage.
Create a Deployment YAML file to point to that image. Use kubectl to create the
deployment with that file.

Your development team needs a new Jenkins server for their project. You need to
deploy the server using the fewest steps possible. What should you do?

​ A. Download and deploy the Jenkins Java WAR to App Engine Standard.
​ B. Create a new Compute Engine instance and install Jenkins through the
command line interface.
​ C. Create a Kubernetes cluster on Compute Engine and create a deployment
with the Jenkins Docker image.
​ D. Use GCP Marketplace to launch the Jenkins solution. Most Voted

You need to update a deployment in Deployment Manager without any resource downtime in the deployment. Which command should you use?

​ A. gcloud deployment-manager deployments create --config <deployment-config-path>
​ B. gcloud deployment-manager deployments update --config
<deployment-config-path> Most Voted
​ C. gcloud deployment-manager resources create --config
<deployment-config-path>
​ D. gcloud deployment-manager resources update --config
<deployment-config-path>

You need to run an important query in BigQuery but expect it to return a lot of records.
You want to find out how much it will cost to run the query. You are using on-demand
pricing. What should you do?

​ A. Arrange to switch to Flat-Rate pricing for this query, then move back to
on-demand.
​ B. Use the command line to run a dry run query to estimate the number of
bytes read. Then convert that bytes estimate to dollars using the Pricing
Calculator. Most Voted
​ C. Use the command line to run a dry run query to estimate the number of bytes
returned. Then convert that bytes estimate to dollars using the Pricing Calculator.
​ D. Run a select count (*) to get an idea of how many records your query will look through. Then convert that number of rows to dollars using the Pricing Calculator.

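The dry-run flow in answer B can be sketched as follows. The bq invocation needs the CLI and a project, so it is shown as a comment; the byte count and the per-TiB rate are assumptions for illustration only (check current on-demand pricing before relying on the number):

```shell
# Dry run (requires the bq CLI and a project); it prints the bytes that
# would be processed without actually running or billing the query:
#   bq query --use_legacy_sql=false --dry_run 'SELECT * FROM `proj.ds.table`'

bytes=3000000000000          # suppose the dry run reported ~3 TB
rate_per_tib=6.25            # assumed on-demand rate in USD per TiB scanned
cost=$(awk -v b="$bytes" -v r="$rate_per_tib" \
    'BEGIN { printf "%.2f", b / (1024 ^ 4) * r }')
echo "$cost"                 # estimated query cost in USD
```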
You have a single binary application that you want to run on Google Cloud Platform.
You decided to automatically scale the application based on underlying infrastructure
CPU usage. Your organizational policies require you to use virtual machines directly.
You need to ensure that the application scaling is operationally efficient and completed
as quickly as possible. What should you do?

​ A. Create a Google Kubernetes Engine cluster, and use horizontal pod autoscaling to scale the application.
​ B. Create an instance template, and use the template in a managed
instance group with autoscaling configured. Most Voted
​ C. Create an instance template, and use the template in a managed instance
group that scales up and down based on the time of day.
​ D. Use a set of third-party tools to build automation around scaling the
application up and down, based on Stackdriver CPU usage monitoring.

You are analyzing Google Cloud Platform service costs from three separate projects.
You want to use this information to create service cost estimates by service type, daily
and monthly, for the next six months using standard query syntax. What should you do?

​ A. Export your bill to a Cloud Storage bucket, and then import into Cloud Bigtable
for analysis.
​ B. Export your bill to a Cloud Storage bucket, and then import into Google Sheets
for analysis.
​ C. Export your transactions to a local file, and perform analysis with a desktop
tool.
​ D. Export your bill to a BigQuery dataset, and then write time
window-based SQL queries for analysis. Most Voted

You need to set up a policy so that videos stored in a specific Cloud Storage Regional
bucket are moved to Coldline after 90 days, and then deleted after one year from their
creation. How should you set up the policy?

​ A. Use Cloud Storage Object Lifecycle Management using Age conditions with
SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days
and the Delete action to 275 days (365 - 90)
​ B. Use Cloud Storage Object Lifecycle Management using Age conditions
with SetStorageClass and Delete actions. Set the SetStorageClass action to
90 days and the Delete action to 365 days. Most Voted
​ C. Use gsutil rewrite and set the Delete action to 275 days (365-90).
​ D. Use gsutil rewrite and set the Delete action to 365 days.

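Answer B expressed as a lifecycle configuration. The bucket name is a placeholder, and the gsutil apply step is commented out since it needs a real bucket:

```shell
# Write the lifecycle rules: Coldline at 90 days, delete at 365 days
# from creation. Age conditions always count from the creation time.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# Apply to the bucket (requires gsutil and a real bucket):
#   gsutil lifecycle set lifecycle.json gs://my-videos-bucket
```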
You have a Linux VM that must connect to Cloud SQL. You created a service account
with the appropriate access rights. You want to make sure that the VM uses this service
account instead of the default Compute Engine service account. What should you do?

​ A. When creating the VM via the web console, specify the service account
under the 'Identity and API Access' section. Most Voted
​ B. Download a JSON Private Key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.
​ C. Download a JSON Private Key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account. Most Voted
​ D. Download a JSON Private Key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.

You created an instance of SQL Server 2017 on Compute Engine to test features in the
new version. You want to connect to this instance using the fewest number of steps.
What should you do?
​ A. Install an RDP client on your desktop. Verify that a firewall rule for port 3389
exists.
​ B. Install an RDP client on your desktop. Set a Windows username and password
in the GCP Console. Use the credentials to log in to the instance. Most Voted
​ C. Set a Windows password in the GCP Console. Verify that a firewall rule for
port 22 exists. Click the RDP button in the GCP Console and supply the
credentials to log in.
​ D. Set a Windows username and password in the GCP Console. Verify that
a firewall rule for port 3389 exists. Click the RDP button in the GCP
Console, and supply the credentials to log in. Most Voted

You have one GCP account running in your default region and zone and another
account running in a non-default region and zone. You want to start a new
Compute Engine instance in these two Google Cloud Platform accounts using the
command line interface. What should you do?

​ A. Create two configurations using gcloud config configurations create [NAME]. Run gcloud config configurations activate [NAME] to switch between accounts when running the commands to start the Compute Engine instances. Most Voted
​ B. Create two configurations using gcloud config configurations create [NAME].
Run gcloud configurations list to start the Compute Engine instances.
​ C. Activate two configurations using gcloud configurations activate [NAME]. Run
gcloud config list to start the Compute Engine instances.
​ D. Activate two configurations using gcloud configurations activate [NAME]. Run
gcloud configurations list to start the Compute Engine instances.

You significantly changed a complex Deployment Manager template and want to confirm that the dependencies of all defined resources are properly met before committing it to the project. You want the most rapid feedback on your changes. What should you do?

​ A. Use granular logging statements within a Deployment Manager template authored in Python.
​ B. Monitor activity of the Deployment Manager execution on the Stackdriver
Logging page of the GCP Console.
​ C. Execute the Deployment Manager template against a separate project with
the same configuration, and monitor for failures.
​ D. Execute the Deployment Manager template using the --preview option in the same project, and observe the state of interdependent resources. Most Voted

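The preview workflow in answer D can be sketched as follows; deployment and config names are placeholders:

```shell
# Stage the changes without applying them:
gcloud deployment-manager deployments update my-deployment \
    --config deployment.yaml --preview

# After reviewing the previewed resource states, either commit:
gcloud deployment-manager deployments update my-deployment
# ...or discard the staged changes:
gcloud deployment-manager deployments cancel-preview my-deployment
```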
You are building a pipeline to process time-series data. Which Google Cloud Platform
services should you put in boxes 1,2,3, and 4?

​ A. Cloud Pub/Sub, Cloud Dataflow, Cloud Datastore, BigQuery
​ B. Firebase Messages, Cloud Pub/Sub, Cloud Spanner, BigQuery
​ C. Cloud Pub/Sub, Cloud Storage, BigQuery, Cloud Bigtable
​ D. Cloud Pub/Sub, Cloud Dataflow, Cloud Bigtable, BigQuery Most Voted
You have a project for your App Engine application that serves a development
environment. The required testing has succeeded and you want to create a new project
to serve as your production environment. What should you do?

​ A. Use gcloud to create the new project, and then deploy your application
to the new project. Most Voted
​ B. Use gcloud to create the new project and to copy the deployed application to
the new project.
​ C. Create a Deployment Manager configuration file that copies the current App
Engine deployment into a new project.
​ D. Deploy your application again using gcloud and specify the project parameter
with the new project name to create the new project.

You need to configure IAM access audit logging in BigQuery for external auditors. You
want to follow Google-recommended practices. What should you do?

​ A. Add the auditors group to the 'logging.viewer' and 'bigQuery.dataViewer' predefined IAM roles. Most Voted
​ B. Add the auditors group to two new custom IAM roles.
​ C. Add the auditor user accounts to the 'logging.viewer' and
'bigQuery.dataViewer' predefined IAM roles.
​ D. Add the auditor user accounts to two new custom IAM roles.

You need to set up permissions for a set of Compute Engine instances to enable them
to write data into a particular Cloud Storage bucket. You want to follow
Google-recommended practices. What should you do?

​ A. Create a service account with an access scope. Use the access scope
'https://www.googleapis.com/auth/devstorage.write_only'.
​ B. Create a service account with an access scope. Use the access scope
'https://www.googleapis.com/auth/cloud-platform'.
​ C. Create a service account and add it to the IAM role
'storage.objectCreator' for that bucket. Most Voted
​ D. Create a service account and add it to the IAM role 'storage.objectAdmin' for
that bucket.

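Answer C can be sketched as follows. The account, project, bucket, and VM names are placeholders; the point is that the bucket-level IAM role, not the access scope, limits what the VMs can write:

```shell
# Dedicated service account for the instances.
gcloud iam service-accounts create bucket-writer

# Grant objectCreator on the one bucket only, not project-wide.
gsutil iam ch \
    serviceAccount:bucket-writer@my-project.iam.gserviceaccount.com:roles/storage.objectCreator \
    gs://my-bucket

# Attach the account with the broad cloud-platform scope, letting
# IAM (rather than legacy scopes) do the access control.
gcloud compute instances create writer-vm \
    --service-account bucket-writer@my-project.iam.gserviceaccount.com \
    --scopes cloud-platform
```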
You have sensitive data stored in three Cloud Storage buckets and have enabled data
access logging. You want to verify activities for a particular user for these buckets, using
the fewest possible steps. You need to verify the addition of metadata labels and which
files have been viewed from those buckets. What should you do?

​ A. Using the GCP Console, filter the Activity log to view the information. Most Voted
​ B. Using the GCP Console, filter the Stackdriver log to view the
information. Most Voted
​ C. View the bucket in the Storage section of the GCP Console.
​ D. Create a trace in Stackdriver to view the information.

You are the project owner of a GCP project and want to delegate control to colleagues
to manage buckets and files in Cloud Storage. You want to follow Google-
recommended practices. Which IAM roles should you grant your colleagues?

​ A. Project Editor
​ B. Storage Admin Most Voted
​ C. Storage Object Admin
​ D. Storage Object Creator

You have an object in a Cloud Storage bucket that you want to share with an external
company. The object contains sensitive data. You want access to the content to be
removed after four hours. The external company does not have a Google account to
which you can grant specific user-based access privileges. You want to use the most
secure method that requires the fewest steps. What should you do?

​ A. Create a signed URL with a four-hour expiration and share the URL with
the company. Most Voted
​ B. Set object access to 'public' and use object lifecycle management to remove
the object after four hours.
​ C. Configure the storage bucket as a static website and furnish the object's URL
to the company. Delete the object from the storage bucket after four hours.
​ D. Create a new Cloud Storage bucket specifically for the external company to
access. Copy the object to that bucket. Delete the bucket after four hours have
passed.

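Answer A boils down to a single command; key file, bucket, and object names below are placeholders:

```shell
# key.json is a service-account private key with access to the object.
gsutil signurl -d 4h key.json gs://my-bucket/sensitive-report.csv
# The printed URL grants access to that one object and expires after four hours;
# the recipient needs no Google account.
```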
You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler
feature enabled. You need to make sure that each node of the cluster will run a
monitoring pod that sends container metrics to a third-party monitoring solution. What
should you do?

​ A. Deploy the monitoring pod in a StatefulSet object.
​ B. Deploy the monitoring pod in a DaemonSet object. Most Voted
​ C. Reference the monitoring pod in a Deployment object.
​ D. Reference the monitoring pod in a cluster initializer at the GKE cluster creation
time.

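A DaemonSet schedules exactly one copy of the pod on every node, including nodes the cluster autoscaler adds later, which is why answer B fits. A minimal sketch (the agent image is a placeholder for the third-party monitoring solution):

```yaml
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: metrics-agent
spec:
  selector:
    matchLabels:
      app: metrics-agent
  template:
    metadata:
      labels:
        app: metrics-agent
    spec:
      containers:
      - name: agent
        # placeholder image for the third-party monitoring agent
        image: registry.example.com/monitoring-agent:1.0
```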
You want to send and consume Cloud Pub/Sub messages from your App Engine
application. The Cloud Pub/Sub API is currently disabled. You will use a service account
to authenticate your application to the API. You want to make sure your application can
use Cloud Pub/Sub. What should you do?

​ A. Enable the Cloud Pub/Sub API in the API Library on the GCP Console.
Most Voted

​ B. Rely on the automatic enablement of the Cloud Pub/Sub API when the
Service Account accesses it.
​ C. Use Deployment Manager to deploy your application. Rely on the automatic
enablement of all APIs used by the application being deployed.
​ D. Grant the App Engine Default service account the role of Cloud Pub/Sub
Admin. Have your application enable the API on the first connection to Cloud
Pub/ Sub.
You need to monitor resources that are distributed over different projects in Google
Cloud Platform. You want to consolidate reporting under the same Stackdriver
Monitoring dashboard. What should you do?

​ A. Use Shared VPC to connect all projects, and link Stackdriver to one of the
projects.
​ B. For each project, create a Stackdriver account. In each project, create a
service account for that project and grant it the role of Stackdriver Account Editor
in all other projects.
​ C. Configure a single Stackdriver account, and link all projects to the same
account. Most Voted
​ D. Configure a single Stackdriver account for one of the projects. In Stackdriver,
create a Group and add the other project names as criteria for that Group.

You are deploying an application to a Compute Engine VM in a managed instance group. The application must be running at all times, but only a single instance of the VM should run per GCP project. How should you configure the instance group?

​ A. Set autoscaling to On, set the minimum number of instances to 1, and then set
the maximum number of instances to 1. Most Voted
​ B. Set autoscaling to Off, set the minimum number of instances to 1, and
then set the maximum number of instances to 1. Most Voted
​ C. Set autoscaling to On, set the minimum number of instances to 1, and then
set the maximum number of instances to 2.
​ D. Set autoscaling to Off, set the minimum number of instances to 1, and then
set the maximum number of instances to 2.

You want to verify the IAM users and roles assigned within a GCP project named
my-project. What should you do?

​ A. Run gcloud iam roles list. Review the output section.
​ B. Run gcloud iam service-accounts list. Review the output section.
​ C. Navigate to the project and then to the IAM section in the GCP Console.
Review the members and roles. Most Voted
​ D. Navigate to the project and then to the Roles section in the GCP Console.
Review the roles and status.

You need to create a new billing account and then link it with an existing Google Cloud
Platform project. What should you do?

​ A. Verify that you are Project Billing Manager for the GCP project. Update the
existing project to link it to the existing billing account. Most Voted
​ B. Verify that you are Project Billing Manager for the GCP project. Create a
new billing account and link the new billing account to the existing project.
Most Voted

​ C. Verify that you are Billing Administrator for the billing account. Create a new
project and link the new project to the existing billing account.
​ D. Verify that you are Billing Administrator for the billing account. Update the
existing project to link it to the existing billing account.

You have one project called proj-sa where you manage all your service accounts. You
want to be able to use a service account from this project to take snapshots of VMs
running in another project called proj-vm. What should you do?

​ A. Download the private key from the service account, and add it to each VMs
custom metadata.
​ B. Download the private key from the service account, and add the private key to
each VM's SSH keys.
​ C. Grant the service account the IAM Role of Compute Storage Admin in
the project called proj-vm. Most Voted
​ D. When creating the VMs, set the service account's API scope for Compute
Engine to read/write.

You created a Google Cloud Platform project with an App Engine application inside the
project. You initially configured the application to be served from the us-central region.
Now you want the application to be served from the asia-northeast1 region. What
should you do?
​ A. Change the default region property setting in the existing GCP project to
asia-northeast1.
​ B. Change the region property setting in the existing App Engine application from
us-central to asia-northeast1.
​ C. Create a second App Engine application in the existing GCP project and
specify asia-northeast1 as the region to serve your application.
​ D. Create a new GCP project and create an App Engine application inside
this new project. Specify asia-northeast1 as the region to serve your
application. Most Voted

You need to grant access for three users so that they can view and edit table data on a
Cloud Spanner instance. What should you do?

​ A. Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to the role.
​ B. Run gcloud iam roles describe roles/spanner.databaseUser. Add the
users to a new group. Add the group to the role. Most Voted
​ C. Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to the role.
​ D. Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to a new group. Add the group to the role.

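Answer B ends with a single binding once the group exists (project ID and group address are placeholders):

```shell
# Put the three users in a group (e.g., via the Admin console), then
# bind the group once; future users are added to the group, not to IAM.
gcloud projects add-iam-policy-binding my-project \
    --member group:spanner-users@example.com \
    --role roles/spanner.databaseUser
```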
You create a new Google Kubernetes Engine (GKE) cluster and want to make sure that
it always runs a supported and stable version of Kubernetes. What should you do?

​ A. Enable the Node Auto-Repair feature for your GKE cluster.
​ B. Enable the Node Auto-Upgrades feature for your GKE cluster. Most Voted
​ C. Select the latest available cluster version for your GKE cluster.
​ D. Select "Container-Optimized OS (cos)" as a node image for your GKE
cluster.

You have an instance group that you want to load balance. You want the load balancer
to terminate the client SSL session. The instance group is used to serve a public web
application over HTTPS. You want to follow Google-recommended practices. What
should you do?

​ A. Configure an HTTP(S) load balancer. Most Voted
​ B. Configure an internal TCP load balancer.
​ C. Configure an external SSL proxy load balancer.
​ D. Configure an external TCP proxy load balancer.


Q166. Your web application has been running successfully on Cloud Run for Anthos.
You want to evaluate an updated version of the application with a specific percentage of
your production users (canary deployment). What should you do?

​ A. Create a new service with the new version of the application. Split traffic
between this version and the version that is currently running.
​ B. Create a new revision with the new version of the application. Split traffic
between this version and the version that is currently running. Most Voted
​ C. Create a new service with the new version of the application. Add an HTTP
Load Balancer in front of both services.
​ D. Create a new revision with the new version of the application. Add an HTTP
Load Balancer in front of both revisions.

Q165. You are running a data warehouse on BigQuery. A partner company is offering a
recommendation engine based on the data in your data warehouse. The partner
company is also running their application on Google Cloud. They manage the resources
in their own project, but they need access to the BigQuery dataset in your project. You
want to provide the partner company with access to the dataset. What should you do?
​ A. Create a Service Account in your own project, and grant this Service Account
access to BigQuery in your project.
​ B. Create a Service Account in your own project, and ask the partner to grant this
Service Account access to BigQuery in their project.
​ C. Ask the partner to create a Service Account in their project, and have them
give the Service Account access to BigQuery in their project.
​ D. Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project. Most Voted

Q164. You have experimented with Google Cloud using your own credit card and
expensed the costs to your company. Your company wants to streamline the billing
process and charge the costs of your projects to their monthly invoice. What should you
do?

​ A. Grant the financial team the IAM role of "Billing Account User" on the billing
account linked to your credit card.
​ B. Set up BigQuery billing export and grant your financial department IAM access
to query the data.
​ C. Create a ticket with Google Billing Support to ask them to send the invoice to
your company.
​ D. Change the billing account of your projects to the billing account of your
company. Most Voted

Q163.You have developed a containerized web application that will serve internal
colleagues during business hours. You want to ensure that no costs are incurred outside
of the hours the application is used. You have just created a new Google Cloud project
and want to deploy the application. What should you do?

​ A. Deploy the container on Cloud Run for Anthos, and set the minimum number
of instances to zero.
​ B. Deploy the container on Cloud Run (fully managed), and set the
minimum number of instances to zero. Most Voted
​ C. Deploy the container on App Engine flexible environment with autoscaling,
and set the value min_instances to zero in the app.yaml.
​ D. Deploy the container on App Engine flexible environment with manual scaling,
and set the value instances to zero in the app.yaml.

Q162. Your company has embraced a hybrid cloud strategy where some of the
applications are deployed on Google Cloud. A Virtual Private Network (VPN) tunnel
connects your Virtual Private Cloud (VPC) in Google Cloud with your company's
on-premises network. Multiple applications in Google Cloud need to connect to an
on-premises database server, and you want to avoid having to change the IP
configuration in all of your applications when the IP of the database changes.
What should you do?

​ A. Configure Cloud NAT for all subnets of your VPC to be used when egressing
from the VM instances.
​ B. Create a private zone on Cloud DNS, and configure the applications with
the DNS name. Most Voted
​ C. Configure the IP of the database as custom metadata for each instance, and
query the metadata server.
​ D. Query the Compute Engine internal DNS from the applications to retrieve the
IP of the database.

Q161.You are performing a monthly security check of your Google Cloud environment
and want to know who has access to view data stored in your Google Cloud
Project. What should you do?

​ A. Enable Audit Logs for all APIs that are related to data storage.
​ B. Review the IAM permissions for any role that allows for data access. Most Voted
​ C. Review the Identity-Aware Proxy settings for each resource.


​ D. Create a Data Loss Prevention job.
Q160. Your company wants to standardize the creation and management of multiple
Google Cloud resources using Infrastructure as Code. You want to minimize the amount
of repetitive code needed to manage the environment. What should you do?

​ A. Develop templates for the environment using Cloud Deployment Manager. Most Voted
​ B. Use curl in a terminal to send a REST request to the relevant Google API for
each individual resource.
​ C. Use the Cloud Console interface to provision and manage all related
resources.
​ D. Create a bash script that contains all requirement steps as gcloud commands.

Q159.You are building an application that processes data files uploaded from thousands
of suppliers. Your primary goals for the application are data security and the expiration
of aged data. You need to design the application to:
* Restrict access so that suppliers can access only their own data.
* Give suppliers write access to data only for 30 minutes.
* Delete data that is over 45 days old.
You have a very short development cycle, and you need to make sure that the
application requires minimal maintenance. Which two strategies should you use?
(Choose two.)

​ A. Build a lifecycle policy to delete Cloud Storage objects after 45 days. Most Voted
​ B. Use signed URLs to allow suppliers limited time access to store their
objects. Most Voted
​ C. Set up an SFTP server for your application, and create a separate user for
each supplier.
​ D. Build a Cloud function that triggers a timer of 45 days to delete objects that
have expired.
​ E. Develop a script that loops through all Cloud Storage buckets and deletes any
buckets that are older than 45 days.
Q158. Your company uses a large number of Google Cloud services centralized in a
single project. All teams have specific projects for testing and development. The
DevOps team needs access to all of the production services in order to perform their
job. You want to prevent Google Cloud product changes from broadening their
permissions in the future. You want to follow Google-recommended practices. What
should you do?

​ A. Grant all members of the DevOps team the role of Project Editor on the
organization level.
​ B. Grant all members of the DevOps team the role of Project Editor on the
production project.
​ C. Create a custom role that combines the required permissions. Grant the
DevOps team the custom role on the production project. Most Voted
​ D. Create a custom role that combines the required permissions. Grant the
DevOps team the custom role on the organization level.

Q157. You are deploying a production application on Compute Engine. You want to
prevent anyone from accidentally destroying the instance by clicking the wrong button.
What should you do?

​ A. Disable the flag "Delete boot disk when instance is deleted."
​ B. Enable delete protection on the instance. Most Voted
​ C. Disable Automatic restart on the instance.
​ D. Enable Preemptibility on the instance.

Q156. You are the team lead of a group of 10 developers. You provided each developer
with an individual Google Cloud Project that they can use as their personal sandbox to
experiment with different Google Cloud solutions. You want to be notified if any of the
developers are spending above $500 per month on their sandbox environment. What
should you do?

​ A. Create a single budget for all projects and configure budget alerts on this
budget.
​ B. Create a separate billing account per sandbox project and enable BigQuery
billing exports. Create a Data Studio dashboard to plot the spending per billing
account.
​ C. Create a budget per project and configure budget alerts on all of these
budgets. Most Voted
​ D. Create a single billing account for all sandbox projects and enable BigQuery
billing exports. Create a Data Studio dashboard to plot the spending per project.

Q155. You are storing sensitive information in a Cloud Storage bucket. For legal
reasons, you need to be able to record all requests that read any of the stored data. You
want to make sure you comply with these requirements. What should you do?

​ A. Enable the Identity Aware Proxy API on the project.
​ B. Scan the bucket using the Data Loss Prevention API.
​ C. Allow only a single Service Account access to read the data.
​ D. Enable Data Access audit logs for the Cloud Storage API. Most Voted

Q154.You have been asked to set up Object Lifecycle Management for objects stored in
storage buckets. The objects are written once and accessed frequently for 30 days.
After 30 days, the objects are not read again unless there is a special need. The objects
should be kept for three years, and you need to minimize cost.
What should you do?

​ A. Set up a policy that uses Nearline storage for 30 days and then moves to
Archive storage for three years.
​ B. Set up a policy that uses Standard storage for 30 days and then moves
to Archive storage for three years. Most Voted
​ C. Set up a policy that uses Nearline storage for 30 days, then moves the
Coldline for one year, and then moves to Archive storage for two years.
​ D. Set up a policy that uses Standard storage for 30 days, then moves to
Coldline for one year, and then moves to Archive storage for two years.
Q153.You have created a code snippet that should be triggered whenever a new file is
uploaded to a Cloud Storage bucket. You want to deploy this code snippet. What should
you do?

​ A. Use App Engine and configure Cloud Scheduler to trigger the application
using Pub/Sub.
​ B. Use Cloud Functions and configure the bucket as a trigger resource. Most Voted
​ C. Use Google Kubernetes Engine and configure a CronJob to trigger the application using Pub/Sub.
​ D. Use Dataflow as a batch job, and configure the bucket as a data source.
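For the voted answer (B), a first-generation Cloud Function wired to the bucket's finalize event looks roughly like the sketch below. The function name and log format are illustrative; the `bucket` and `name` fields are what Cloud Storage triggers deliver in the event payload.

```python
def handle_new_file(event, context):
    """Background Cloud Function triggered when an object is finalized
    (uploaded) in the configured Cloud Storage bucket."""
    bucket = event["bucket"]  # bucket that fired the event
    name = event["name"]      # name of the uploaded object
    message = f"Processing gs://{bucket}/{name}"
    print(message)
    return message
```

Deployment would be along the lines of `gcloud functions deploy handle_new_file --runtime python310 --trigger-resource BUCKET_NAME --trigger-event google.storage.object.finalize`.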

Q152. Your company runs its Linux workloads on Compute Engine instances. Your
company will be working with a new operations partner that does not use Google
Accounts. You need to grant access to the instances to your operations partner so they
can maintain the installed tooling. What should you do?

​ A. Enable Cloud IAP for the Compute Engine instances, and add the
operations partner as a Cloud IAP Tunnel User. Most Voted
​ B. Tag all the instances with the same network tag. Create a firewall rule in the
VPC to grant TCP access on port 22 for traffic from the operations partner to
instances with the network tag.
​ C. Set up Cloud VPN between your Google Cloud VPC and the internal network
of the operations partner.
​ D. Ask the operations partner to generate SSH key pairs, and add the public
keys to the VM instances.

Q151. You are working with a user to set up an application in a new VPC behind a
firewall. The user is concerned about data egress. You want to configure the fewest
open egress ports. What should you do?

​ A. Set up a low-priority (65534) rule that blocks all egress and a high-priority rule (1000) that allows only the appropriate ports. Most Voted
​ B. Set up a high-priority (1000) rule that pairs both ingress and egress ports.
​ C. Set up a high-priority (1000) rule that blocks all egress and a low-priority
(65534) rule that allows only the appropriate ports.
​ D. Set up a high-priority (1000) rule to allow the appropriate ports.

Q150. You have a batch workload that runs every night and uses a large number of
virtual machines (VMs). It is fault-tolerant and can tolerate some of the VMs being
terminated. The current cost of VMs is too high. What should you do?

​ A. Run a test using simulated maintenance events. If the test is successful, use preemptible N1 Standard VMs when running future jobs. Most Voted
​ B. Run a test using simulated maintenance events. If the test is successful, use
N1 Standard VMs when running future jobs.
​ C. Run a test using a managed instance group. If the test is successful, use N1
Standard VMs in the managed instance group when running future jobs.
​ D. Run a test using N1 standard VMs instead of N2. If the test is successful, use
N1 Standard VMs when running future jobs

Q149. You are running multiple VPC-native Google Kubernetes Engine clusters in the
same subnet. The IPs available for the nodes are exhausted, and you want to ensure
that the clusters can grow in nodes when needed. What should you do?

​ A. Create a new subnet in the same region as the subnet being used.
​ B. Add an alias IP range to the subnet used by the GKE clusters.
​ C. Create a new VPC, and set up VPC peering with the existing VPC.
​ D. Expand the CIDR range of the relevant subnet for the cluster. Most Voted

Q148. A colleague handed over a Google Cloud Platform project for you to maintain. As
part of a security checkup, you want to review who has been granted the Project
Owner role. What should you do?

​ A. In the console, validate which SSH keys have been stored as project-wide
keys.
​ B. Navigate to Identity-Aware Proxy and check the permissions for these
resources.
​ C. Enable Audit Logs on the IAM & admin page for all resources, and validate the
results.
​ D. Use the command gcloud projects get-iam-policy to view the current role assignments. Most Voted
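The voted answer (D) can be narrowed to just Project Owners with, for example, `gcloud projects get-iam-policy PROJECT_ID --flatten="bindings[].members" --filter="bindings.role:roles/owner"`. The same filtering, sketched in Python against an illustrative policy document (the sample members are made up; a real policy comes from `--format=json`):

```python
def owners(iam_policy):
    """Return the members bound to roles/owner in a project IAM policy
    (dict shape matches `gcloud projects get-iam-policy --format=json`)."""
    for binding in iam_policy.get("bindings", []):
        if binding["role"] == "roles/owner":
            return sorted(binding.get("members", []))
    return []

# Illustrative policy snippet
sample_policy = {
    "bindings": [
        {"role": "roles/viewer", "members": ["user:auditor@example.com"]},
        {"role": "roles/owner",
         "members": ["user:alice@example.com", "user:bob@example.com"]},
    ]
}

print(owners(sample_policy))
```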

Q147. You are hosting an application from Compute Engine virtual machines (VMs) in us-central1-a. You want to adjust your design to support the failure of a single Compute Engine zone, eliminate downtime, and minimize cost. What should you do?

​ A. Create Compute Engine resources in us-central1-b. Balance the load across both us-central1-a and us-central1-b. Most Voted
​ B. Create a Managed Instance Group and specify us-central1-a as the zone. Configure the Health Check with a short Health Interval.
​ C. Create an HTTP(S) Load Balancer. Create one or more global forwarding rules to direct traffic to your VMs.
​ D. Perform regular backups of your application. Create a Cloud Monitoring Alert and be notified if your application becomes unavailable. Restore from backups when notified.

Q146. You have an application that uses Cloud Spanner as a database backend to
keep current state information about users. Cloud Bigtable logs all events triggered by
users. You export Cloud Spanner data to Cloud Storage during daily backups. One of
your analysts asks you to join data from Cloud Spanner and Cloud
Bigtable for specific users. You want to complete this ad hoc request as efficiently as
possible. What should you do?

​ A. Create a dataflow job that copies data from Cloud Bigtable and Cloud Storage
for specific users.
​ B. Create a dataflow job that copies data from Cloud Bigtable and Cloud Spanner
for specific users.
​ C. Create a Cloud Dataproc cluster that runs a Spark job to extract data from
Cloud Bigtable and Cloud Storage for specific users.
​ D. Create two separate BigQuery external tables on Cloud Storage and
Cloud Bigtable. Use the BigQuery console to join these tables through user

Q145. Your existing application running in Google Kubernetes Engine (GKE) consists of multiple pods running on four GKE n1-standard-2 nodes. You need to deploy additional pods requiring n2-highmem-16 nodes without any downtime. What should you do?

​ A. Use gcloud container clusters upgrade. Deploy the new services.
​ B. Create a new Node Pool and specify machine type n2-highmem-16. Deploy the new pods. Most Voted
​ C. Create a new cluster with n2-highmem-16 nodes. Redeploy the pods and delete the old cluster.
​ D. Create a new cluster with both n1-standard-2 and n2-highmem-16 nodes. Redeploy the pods and delete the old cluster.

Q144. You have a Compute Engine instance hosting an application used between 9 AM
and 6 PM on weekdays. You want to back up this instance daily for disaster recovery
purposes. You want to keep the backups for 30 days. You want the
Google-recommended solution with the least management overhead and the least
number of services. What should you do?

​ A. 1. Update your instances' metadata to add the following value: snapshot-schedule: 0 1 * * * 2. Update your instances' metadata to add the following value: snapshot-retention: 30
​ B. 1. In the Cloud Console, go to the Compute Engine Disks page and select your instance's disk. 2. In the Snapshot Schedule section, select Create Schedule and configure the following parameters: - Schedule frequency: Daily - Start time: 1:00 AM - 2:00 AM - Autodelete snapshots after: 30 days Most Voted
​ C. 1. Create a Cloud Function that creates a snapshot of your instance's disk. 2.
Create a Cloud Function that deletes snapshots that are older than 30 days. 3.
Use Cloud Scheduler to trigger both Cloud Functions daily at 1:00 AM.
​ D. 1. Create a bash script in the instance that copies the content of the disk to
Cloud Storage. 2. Create a bash script in the instance that deletes data older
than 30 days in the backup Cloud Storage bucket. 3. Configure the instance's
crontab to execute these scripts daily at 1:00 AM.

Q143. Your team maintains the infrastructure for your organization. The current
infrastructure requires changes. You need to share your proposed changes with the rest
of the team. You want to follow Google's recommended best practices. What should you
do?

​ A. Use Deployment Manager templates to describe the proposed changes and store them in a Cloud Storage bucket.
​ B. Use Deployment Manager templates to describe the proposed changes
and store them in Cloud Source Repositories. Most Voted
​ C. Apply the changes in a development environment, run gcloud compute
instances list, and then save the output in a shared Storage bucket.
​ D. Apply the changes in a development environment, run gcloud compute
instances list, and then save the output in Cloud Source Repositories.

Q142. Your company has workloads running on Compute Engine and on-premises. The
Google Cloud Virtual Private Cloud (VPC) is connected to your WAN over a
Virtual Private Network (VPN). You need to deploy a new Compute Engine instance and
ensure that no public Internet traffic can be routed to it. What should you do?

​ A. Create the instance without a public IP address. Most Voted


​ B. Create the instance with Private Google Access enabled.
​ C. Create a deny-all egress firewall rule on the VPC network.
​ D. Create a route on the VPC to route all traffic to the instance over the VPN
tunnel.

Q141. You created several resources in multiple Google Cloud projects. All projects are
linked to different billing accounts. To better estimate future charges, you want to have a
single visual representation of all costs incurred. You want to include new cost data as
soon as possible. What should you do?
​ A. Configure Billing Data Export to BigQuery and visualize the data in Data
Studio. Most Voted
​ B. Visit the Cost Table page to get a CSV export and visualize it using Data
Studio.
​ C. Fill all resources in the Pricing Calculator to get an estimate of the monthly
cost.
​ D. Use the Reports view in the Cloud Billing Console to view the desired cost
information.

Q140. You are asked to set up application performance monitoring on Google Cloud
projects A, B, and C as a single pane of glass. You want to monitor CPU, memory, and
disk. What should you do?

​ A. Enable API and then share charts from project A, B, and C.


​ B. Enable API and then give the metrics.reader role to projects A, B, and C.
​ C. Enable API and then use default dashboards to view all projects in sequence.
​ D. Enable API, create a workspace under project A, and then add projects B
and C. Most Voted

Q139. The core business of your company is to rent out construction equipment at large scale.
All the equipment that is being rented out has been equipped with multiple sensors that send
event information every few seconds. These signals can vary from engine status, distance
traveled, fuel level, and more. Customers are billed based on the consumption monitored by
these sensors. You expect high throughput, up to thousands of events per hour per device,
and need to retrieve consistent data based on the time of the event. Storing and retrieving
individual signals should be atomic. What should you do?
A. Create a file in Cloud Storage per device and append new data to that file.
B. Create a file in Cloud Filestore per device and append new data to that file.
C. Ingest the data into Datastore. Store data in an entity group based on the device.
D. Ingest the data into Cloud Bigtable. Create a row key based on the event timestamp.
Most Voted
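On the voted answer (D), one caveat worth noting: a row key that starts with the raw timestamp concentrates writes on a single tablet. Bigtable schema guidance is usually to lead with a device identifier and append the timestamp, which still allows time-ordered range scans per device. A sketch (the `#` separator and 13-digit millisecond padding are illustrative choices):

```python
def row_key(device_id: str, event_ts_ms: int) -> bytes:
    """Compose a Bigtable row key as <device_id>#<timestamp>.

    Leading with the device ID spreads writes across tablets, while the
    zero-padded timestamp keeps each device's events in lexicographic
    (= chronological) order. Padding assumes millisecond epoch timestamps.
    """
    return f"{device_id}#{event_ts_ms:013d}".encode()

print(row_key("excavator-042", 1700000000000))
```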

Q138. You have downloaded and installed the gcloud command line interface (CLI) and
have authenticated with your Google Account. Most of your Compute Engine instances
in your project run in the europe-west1-d zone. You want to avoid having to specify this
zone with each CLI command when managing these instances.
What should you do?

​ A. Set the europe-west1-d zone as the default zone using the gcloud config
subcommand. Most Voted
​ B. In the Settings page for Compute Engine under Default location, set the zone to europe-west1-d.
​ C. In the CLI installation directory, create a file called default.conf containing zone=europe-west1-d.
​ D. Create a Metadata entry on the Compute Engine page with key compute/zone and value europe-west1-d.

Q137. You are developing a new application and are looking for a Jenkins installation to
build and deploy your source code. You want to automate the installation as quickly and
easily as possible. What should you do?

​ A. Deploy Jenkins through the Google Cloud Marketplace. Most Voted


​ B. Create a new Compute Engine instance. Run the Jenkins executable.
​ C. Create a new Kubernetes Engine cluster. Create a deployment for the Jenkins
image.
​ D. Create an instance template with the Jenkins executable. Create a managed
instance group with this template.

Q136. Your company runs one batch process in an on-premises server that takes
around 30 hours to complete. The task runs monthly, can be performed offline, and
must be restarted if interrupted. You want to migrate this workload to the cloud while
minimizing cost. What should you do?

​ A. Migrate the workload to a Compute Engine Preemptible VM.


​ B. Migrate the workload to a Google Kubernetes Engine cluster with Preemptible
nodes.
​ C. Migrate the workload to a Compute Engine VM. Start and stop the
instance as needed. Most Voted
​ D. Create an Instance Template with Preemptible VMs On. Create a Managed
Instance Group from the template and adjust Target CPU Utilization. Migrate the
workload.

Q135. You need to create a Compute Engine instance in a new project that doesn't exist
yet. What should you do?

​ A. Using the Cloud SDK, create a new project, enable the Compute Engine
API in that project, and then create the instance specifying your new
project. Most Voted
​ B. Enable the Compute Engine API in the Cloud Console, use the Cloud SDK to
create the instance, and then use the --project flag to specify a new project.
​ C. Using the Cloud SDK, create the new instance, and use the --project flag to
specify the new project. Answer yes when prompted by Cloud SDK to enable the
Compute Engine API.
​ D. Enable the Compute Engine API in the Cloud Console. Go to the Compute
Engine section of the Console to create a new instance, and look for the Create
In A New Project option in the creation form.

Q134. You built an application on your development laptop that uses Google Cloud
services. Your application uses Application Default Credentials for authentication and
works fine on your development laptop. You want to migrate this application to a
Compute Engine virtual machine (VM) and set up authentication using Google-
recommended practices and minimal changes. What should you do?

​ A. Assign appropriate access for Google services to the service account used by
the Compute Engine VM. Most Voted
​ B. Create a service account with appropriate access for Google services,
and configure the application to use this account. Most Voted
​ C. Store credentials for service accounts with appropriate access for Google
services in a config file, and deploy this config file with your application.
​ D. Store credentials for your user account with appropriate access for Google
services in a config file, and deploy this config file with your application.

Q133. An application generates daily reports in a Compute Engine virtual machine (VM).
The VM is in the project corp-iot-insights. Your team operates only in the project
corp-aggregate-reports and needs a copy of the daily exports in the bucket
corp-aggregate-reports-storage. You want to configure access so that the daily reports
from the VM are available in the bucket corp-aggregate-reports-storage and use as few
steps as possible while following Google-recommended practices. What should you do?

​ A. Move both projects under the same folder.
​ B. Grant the VM Service Account the role Storage Object Creator on corp-aggregate-reports-storage. Most Voted
​ C. Create a Shared VPC network between both projects. Grant the VM Service Account the role Storage Object Creator on corp-iot-insights.
​ D. Make corp-aggregate-reports-storage public and create a folder with a pseudo-randomized suffix name. Share the folder with the IoT team.

Q132. You have deployed an application on a single Compute Engine instance. The application writes logs to disk. Users start reporting errors with the application. You want to diagnose the problem. What should you do?

​ A. Navigate to Cloud Logging and view the application logs.


​ B. Connect to the instance's serial console and read the application logs.
​ C. Configure a Health Check on the instance and set a Low Healthy Threshold
value.
​ D. Install and configure the Cloud Logging Agent and view the logs from
Cloud Logging. Most Voted
Q131. You need to create a copy of a custom Compute Engine virtual machine (VM) to
facilitate an expected increase in application traffic due to a business acquisition.
What should you do?

​ A. Create a Compute Engine snapshot of your base VM. Create your images
from that snapshot.
​ B. Create a Compute Engine snapshot of your base VM. Create your instances
from that snapshot. Most Voted
​ C. Create a custom Compute Engine image from a snapshot. Create your
images from that image.
​ D. Create a custom Compute Engine image from a snapshot. Create your
instances from that image. Most Voted

Q130. You manage an App Engine Service that aggregates and visualizes data from
BigQuery. The application is deployed with the default App Engine Service account.
The data that needs to be visualized resides in a different project managed by another
team. You do not have access to this project, but you want your application to be able to
read data from the BigQuery dataset. What should you do?

​ A. Ask the other team to grant your default App Engine Service account the role
of BigQuery Job User. Most Voted
​ B. Ask the other team to grant your default App Engine Service account the
role of BigQuery Data Viewer. Most Voted
​ C. In Cloud IAM of your project, ensure that the default App Engine service
account has the role of BigQuery Data Viewer.
​ D. In Cloud IAM of your project, grant a newly created service account from the
other team the role of BigQuery Job User in your project.

Q129. Your Dataproc cluster runs in a single Virtual Private Cloud (VPC) network in a
single subnet with range 172.16.20.128/25. There are no private IP addresses available
in the VPC network. You want to add new VMs to communicate with your cluster using
the minimum number of steps. What should you do?
​ A. Modify the existing subnet range to 172.16.20.0/24. Most Voted
​ B. Create a new Secondary IP Range in the VPC and configure the VMs to use
that range.
​ C. Create a new VPC network for the VMs. Enable VPC Peering between the
VMs' VPC network and the Dataproc cluster VPC network. Most Voted
​ D. Create a new VPC network for the VMs with a subnet of 172.32.0.0/16.
Enable VPC network Peering between the Dataproc VPC network and the VMs
VPC network. Configure a custom Route exchange.
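The voted answer (A) works because 172.16.20.0/24 is the immediate supernet of 172.16.20.128/25, so existing VMs keep their addresses while 128 new ones become available. This can be checked with Python's `ipaddress` module; in practice the change would be made with something like `gcloud compute networks subnets expand-ip-range SUBNET --region REGION --prefix-length 24`.

```python
import ipaddress

current = ipaddress.ip_network("172.16.20.128/25")
expanded = current.supernet(new_prefix=24)  # what expanding to /24 yields

print(expanded)                        # the containing /24 network
print(current.subnet_of(expanded))     # existing range is preserved
print(expanded.num_addresses - current.num_addresses)  # newly freed addresses
```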

Q128. You have an application on a general-purpose Compute Engine instance that is experiencing excessive disk read throttling on its Zonal SSD Persistent Disk. The
application primarily reads large files from disk. The disk size is currently 350 GB. You
want to provide the maximum amount of throughput while minimizing costs.
What should you do?

​ A. Increase the size of the disk to 1 TB.


​ B. Increase the allocated CPU to the instance.
​ C. Migrate to use a Local SSD on the instance. Most Voted
​ D. Migrate to use a Regional SSD on the instance.

Q127. You have an application that receives SSL-encrypted TCP traffic on port 443.
Clients for this application are located all over the world. You want to minimize latency
for the clients. Which load balancing option should you use?

​ A. HTTPS Load Balancer


​ B. Network Load Balancer
​ C. SSL Proxy Load Balancer Most Voted
​ D. Internal TCP/UDP Load Balancer. Add a firewall rule allowing ingress traffic
from 0.0.0.0/0 on the target instances.
Q126. You have designed a solution on Google Cloud that uses multiple Google Cloud
products. Your company has asked you to estimate the costs of the solution. You need
to provide estimates for the monthly total cost. What should you do?

​ A. For each Google Cloud product in the solution, review the pricing details
on the products pricing page. Use the pricing calculator to total the
monthly costs for each Google Cloud product. Most Voted
​ B. For each Google Cloud product in the solution, review the pricing details on
the products pricing page. Create a Google Sheet that summarizes the expected
monthly costs for each product.
​ C. Provision the solution on Google Cloud. Leave the solution provisioned for 1
week. Navigate to the Billing Report page in the Cloud Console. Multiply the 1
week cost to determine the monthly costs.
​ D. Provision the solution on Google Cloud. Leave the solution provisioned for 1
week. Use Cloud Monitoring to determine the provisioned and used resource
amounts. Multiply the 1 week cost to determine the monthly costs.

Q125. Your company publishes large files on an Apache web server that runs on a
Compute Engine instance. The Apache web server is not the only application running in
the project. You want to receive an email when the egress network costs for the server
exceed 100 dollars for the current month as measured by Google Cloud.
What should you do?

​ A. Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of "email."
​ B. Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of "email."
​ C. Export the billing data to BigQuery. Create a Cloud Function that uses
BigQuery to sum the egress network costs of the exported billing data for
the Apache web server for the current month and sends an email if it is
over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to
run hourly. Most Voted
​ D. Use the Cloud Logging Agent to export the Apache web server logs to Cloud
Logging. Create a Cloud Function that uses BigQuery to parse the HTTP
response log data in Cloud Logging for the current month and sends an email if
the size of all HTTP responses, multiplied by current Google Cloud egress
prices, totals over 100 dollars. Schedule the Cloud Function using Cloud
Scheduler to run hourly.
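The check the Cloud Function in the voted answer (C) performs can be sketched as below. The row shape is a simplified stand-in for the BigQuery billing export schema (real exports nest SKU and service descriptions), and the SKU names and costs are illustrative:

```python
def egress_cost_over_budget(rows, budget_usd=100.0):
    """Sum egress costs from (simplified) billing-export rows and decide
    whether the alert email should be sent."""
    total = sum(r["cost"] for r in rows if "egress" in r["sku"].lower())
    return total > budget_usd, round(total, 2)

# Illustrative rows standing in for the current month's exported billing data
rows = [
    {"sku": "Network Internet Egress from Americas", "cost": 70.25},
    {"sku": "N1 Predefined Instance Core", "cost": 40.00},
    {"sku": "Network Internet Egress from EMEA", "cost": 35.10},
]
print(egress_cost_over_budget(rows))
```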

Q124. You have an application that uses Cloud Spanner as a backend database. The
application has a very predictable traffic pattern. You want to automatically scale up or
down the number of Spanner nodes depending on traffic. What should you do?

​ A. Create a cron job that runs on a scheduled basis to review Cloud Monitoring
metrics, and then resize the Spanner instance accordingly.

​ B. Create a Cloud Monitoring alerting policy to send an alert to oncall SRE emails
when Cloud Spanner CPU exceeds the threshold. SREs would scale resources
up or down accordingly.

​ C. Create a Cloud Monitoring alerting policy to send an alert to Google Cloud Support email when Cloud Spanner CPU exceeds your threshold. Google support would scale resources up or down accordingly.
​ D. Create a Cloud Monitoring alerting policy to send an alert to a webhook when Cloud Spanner CPU is over or under your threshold. Create a Cloud Function that listens to HTTP and resizes Spanner resources accordingly. Most Voted
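The resize decision inside the Cloud Function from the voted answer (D) might look like the following sketch. The thresholds, step size, and bounds are illustrative assumptions; the real function would read the CPU value from the webhook payload and then apply the new count through the Spanner Admin API:

```python
def target_node_count(current_nodes, cpu_utilization,
                      high=0.65, low=0.25, min_nodes=1, max_nodes=10):
    """Pick a new Spanner node count from the CPU metric delivered to
    the webhook. Scale up one node above `high`, down one below `low`,
    and stay within [min_nodes, max_nodes]."""
    if cpu_utilization > high:
        return min(current_nodes + 1, max_nodes)
    if cpu_utilization < low:
        return max(current_nodes - 1, min_nodes)
    return current_nodes

print(target_node_count(3, 0.80))  # scale up
print(target_node_count(3, 0.10))  # scale down
print(target_node_count(1, 0.10))  # respects the floor
```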

Q123. You have a Compute Engine instance hosting a production application. You want
to receive an email if the instance consumes more than 90% of its CPU resources for
more than 15 minutes. You want to use Google services. What should you do?

​ A. 1. Create a consumer Gmail account. 2. Write a script that monitors the CPU
usage. 3. When the CPU usage exceeds the threshold, have that script send an
email using the Gmail account and smtp.gmail.com on port 25 as SMTP server.
​ B. 1. Create a Cloud Monitoring Workspace and associate your Google
Cloud Platform (GCP) project with it. 2. Create a Cloud Monitoring Alerting
Policy that uses the threshold as a trigger condition. 3. Configure your
email address in the notification channel. Most Voted

​ C. 1. Create a Cloud Monitoring Workspace and associate your GCP project with
it. 2. Write a script that monitors the CPU usage and sends it as a custom metric
to Cloud Monitoring. 3. Create an uptime check for the instance in Cloud
Monitoring.

​ D. 1. In Cloud Logging, create a logs-based metric to extract the CPU usage by using this regular expression: CPU Usage: ([0-9] {1,3})% 2. In Cloud Monitoring, create an Alerting Policy based on this metric. 3. Configure your email address in the notification channel.
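The alerting policy in the voted answer (B) corresponds to a Cloud Monitoring policy roughly like the sketch below (field shapes per the Monitoring API v3; the display names are illustrative, and the email notification channel is attached separately via a `notificationChannels` field):

```json
{
  "displayName": "VM CPU above 90% for 15 minutes",
  "combiner": "OR",
  "conditions": [
    {
      "displayName": "CPU utilization threshold",
      "conditionThreshold": {
        "filter": "resource.type=\"gce_instance\" AND metric.type=\"compute.googleapis.com/instance/cpu/utilization\"",
        "comparison": "COMPARISON_GT",
        "thresholdValue": 0.9,
        "duration": "900s"
      }
    }
  ]
}
```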

Q122. Your company is moving from an on-premises environment to Google Cloud. You
have multiple development teams that use Cassandra environments as backend
databases. They all need a development environment that is isolated from other
Cassandra instances. You want to move to Google Cloud quickly and with minimal
support effort. What should you do?

​ A. 1. Build an instruction guide to install Cassandra on Google Cloud. 2. Make the instruction guide accessible to your developers.
​ B. 1. Advise your developers to go to Cloud Marketplace. 2. Ask the developers to launch a Cassandra image for their development work. Most Voted
​ C. 1. Build a Cassandra Compute Engine instance and take a snapshot of it. 2. Use the snapshot to create instances for your developers.
​ D. 1. Build a Cassandra Compute Engine instance and take a snapshot of it. 2. Upload the snapshot to Cloud Storage and make it accessible to your developers. 3. Build instructions to create a Compute Engine instance from the snapshot so that developers can do it themselves.

Q121.Your managed instance group raised an alert stating that new instance creation
has failed to create new instances. You need to maintain the number of running
instances specified by the template to be able to process expected application traffic.
What should you do?

​ A. Create an instance template that contains valid syntax which will be used by the instance group. Delete any persistent disks with the same name as instance names. Most Voted

​ B. Create an instance template that contains valid syntax that will be used by the
instance group. Verify that the instance name and persistent disk name values
are not the same in the template.

​ C. Verify that the instance template being used by the instance group contains
valid syntax. Delete any persistent disks with the same name as instance names.
Set the disks.autoDelete property to true in the instance template.

​ D. Delete the current instance template and replace it with a new instance
template. Verify that the instance name and persistent disk name values are not
the same in the template. Set the disks.autoDelete property to true in the
instance template.

Q120. You need to manage multiple Google Cloud projects in the fewest steps possible.
You want to configure the Google Cloud SDK command line interface (CLI) so that you
can easily manage multiple projects. What should you do?
​ A. 1. Create a configuration for each project you need to manage. 2.
Activate the appropriate configuration when you work with each of your
assigned Google Cloud projects. Most Voted

​ B. 1. Create a configuration for each project you need to manage. 2. Use gcloud
init to update the configuration values when you need to work with a non-default
project

​ C. 1. Use the default configuration for one project you need to manage. 2.
Activate the appropriate configuration when you work with each of your assigned
Google Cloud projects.

​ D. 1. Use the default configuration for one project you need to manage. 2. Use
gcloud init to update the configuration values when you need to work with a
non-default project.

Q119. Your company has a large quantity of unstructured data in different file formats.
You want to perform ETL transformations on the data. You need to make the data
accessible on Google Cloud so it can be processed by a Dataflow job. What should you
do?

​ A. Upload the data to BigQuery using the bq command line tool.

​ B. Upload the data to Cloud Storage using the gsutil command line tool.
Most Voted

​ C. Upload the data into Cloud SQL using the import function in the console.

​ D. Upload the data into Cloud Spanner using the import function in the console.
Q118. You need to create a custom IAM role for use with a GCP service. All
permissions in the role must be suitable for production use. You also want to clearly
share with your organization the status of the custom role. This will be the first version
of the custom role. What should you do?

​ A. Use permissions in your role that use the 'supported' support level for
role permissions. Set the role stage to ALPHA while testing the role
permissions. Most Voted

​ B. Use permissions in your role that use the 'supported' support level for role
permissions. Set the role stage to BETA while testing the role permissions.

​ C. Use permissions in your role that use the 'testing' support level for role
permissions. Set the role stage to ALPHA while testing the role permissions.

​ D. Use permissions in your role that use the 'testing' support level for role
permissions. Set the role stage to BETA while testing the role permissions.

Q117. An employee was terminated, but their access to Google Cloud was not removed
until 2 weeks later. You need to find out if this employee accessed any sensitive
customer information after their termination. What should you do?

​ A. View System Event Logs in Cloud Logging. Search for the user's email as the
principal.

​ B. View System Event Logs in Cloud Logging. Search for the service account
associated with the user.

​ C. View Data Access audit logs in Cloud Logging. Search for the user's
email as the principal. Most Voted

​ D. View the Admin Activity log in Cloud Logging. Search for the service account
associated with the user.
Q116. You are configuring service accounts for an application that spans multiple
projects. Virtual machines (VMs) running in the web-applications project need access to
BigQuery datasets in crm-databases-proj. You want to follow Google-recommended
practices to give access to the service account in the web-applications project. What
should you do?

​ A. Give "project owner" for web-applications appropriate roles to crm-databases-proj.
​ B. Give "project owner" role to crm-databases-proj and the web-applications project.
​ C. Give "project owner" role to crm-databases-proj and bigquery.dataViewer role to web-applications.
​ D. Give bigquery.dataViewer role to crm-databases-proj and appropriate roles to web-applications. Most Voted

Q115. You need to reduce GCP service costs for a division of your company using the
fewest possible steps. You need to turn off all configured services in an existing
GCP project. What should you do?

​ A. 1. Verify that you are assigned the Project Owners IAM role for this
project. 2. Locate the project in the GCP console, click Shut down and then
enter the project ID. Most Voted

​ B. 1. Verify that you are assigned the Project Owners IAM role for this project. 2.
Switch to the project in the GCP console, locate the resources and delete them.

​ C. 1. Verify that you are assigned the Organizational Administrator IAM role for
this project. 2. Locate the project in the GCP console, enter the project ID and
then click Shut down.
​ D. 1. Verify that you are assigned the Organizational Administrators IAM role for
this project. 2. Switch to the project in the GCP console, locate the resources and
delete them.

Q114. You are managing several Google Cloud Platform (GCP) projects and need
access to all logs for the past 60 days. You want to be able to explore and quickly
analyze the log contents. You want to follow Google-recommended practices to obtain
the combined logs for all projects. What should you do?

​ A. Navigate to Stackdriver Logging and select resource.labels.project_id="*"

​ B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days. Most Voted
​ C. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.

​ D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in
BigQuery. Configure the table expiration to 60 days.

Q113. You need to assign a Cloud Identity and Access Management (Cloud IAM) role to
an external auditor. The auditor needs to have permissions to review your
Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs.
What should you do?

​ A. Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.

​ B. Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy. Most Voted
​ C. Assign the auditor's IAM user to a custom role that has
logging.privateLogEntries.list permission. Perform the export of logs to Cloud
Storage.

​ D. Assign the auditor's IAM user to a custom role that has logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.

Q112. You have a workload running on Compute Engine that is critical to your business.
You want to ensure that the data on the boot disk of this workload is backed up
regularly. You need to be able to restore a backup as quickly as possible in case of
disaster. You also want older backups to be cleaned automatically to save on cost. You
want to follow Google-recommended practices. What should you do?

​ A. Create a Cloud Function to create an instance template.

​ B. Create a snapshot schedule for the disk using the desired interval. Most
Voted

​ C. Create a cron job to create a new disk from the disk using gcloud.

​ D. Create a Cloud Task to create an image and export it to Cloud Storage.
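The snapshot schedule in answer B takes two gcloud commands: create the schedule, then attach it to the disk. A minimal sketch, assuming placeholder names, region, zone, and a 14-day retention:

```shell
# Create a daily snapshot schedule that keeps snapshots for 14 days
gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 \
    --max-retention-days=14 \
    --daily-schedule \
    --start-time=04:00

# Attach the schedule to the workload's boot disk
gcloud compute disks add-resource-policies my-critical-disk \
    --resource-policies=daily-backup \
    --zone=us-central1-a
```

Retention handles the automatic cleanup of older backups; restoring from a snapshot is fast because a new disk can be created directly from it.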

Q111. Your management has asked an external auditor to review all the resources in a
specific project. The security team has enabled the Organization Policy called
Domain Restricted Sharing on the organization node by specifying only your Cloud
Identity domain. You want the auditor to only be able to view, but not modify, the
resources in that project. What should you do?

​ A. Ask the auditor for their Google account, and give them the Viewer role on the
project.
​ B. Ask the auditor for their Google account, and give them the Security Reviewer
role on the project.

​ C. Create a temporary account for the auditor in Cloud Identity, and give
that account the Viewer role on the project. Most Voted

​ D. Create a temporary account for the auditor in Cloud Identity, and give that
account the Security Reviewer role on the project.

Q110. You have successfully created a development environment in a project for an application. This application uses Compute Engine and Cloud SQL. Now you need to
create a production environment for this application. The security team has forbidden
the existence of network routes between these 2 environments and has asked you to
follow Google-recommended practices. What should you do?

​ A. Create a new project, enable the Compute Engine and Cloud SQL APIs in
that project, and replicate the setup you have created in the development
environment. Most Voted

​ B. Create a new production subnet in the existing VPC and a new production
Cloud SQL instance in your existing project, and deploy your application using
those resources.

​ C. Create a new project, modify your existing VPC to be a Shared VPC, share
that VPC with your new project, and replicate the setup you have in the
development environment in that new project in the Shared VPC.

​ D. Ask the security team to grant you the Project Editor role in an existing
production project used by another division of your company. Once they grant
you that role, replicate the setup you have in the development environment in
that project.
Q109. Your organization has user identities in Active Directory. Your organization wants
to use Active Directory as their source of truth for identities. Your organization wants to
have full control over the Google accounts used by employees for all Google services,
including your Google Cloud Platform (GCP) organization. What should you do?

​ A. Use Google Cloud Directory Sync (GCDS) to synchronize users into Cloud
Identity. Most Voted

​ B. Use the cloud Identity APIs and write a script to synchronize users to Cloud
Identity.

​ C. Export users from Active Directory as a CSV and import them to Cloud Identity
via the Admin Console.

​ D. Ask each employee to create a Google account using self signup. Require
that each employee use their company email address and password.

Q108. A team of data scientists infrequently needs to use a Google Kubernetes Engine
(GKE) cluster that you manage. They require GPUs for some long-running, non-
restartable jobs. You want to minimize cost. What should you do?

​ A. Enable node auto-provisioning on the GKE cluster. Most Voted

​ B. Create a VerticalPodAutoscaler for those workloads.

​ C. Create a node pool with preemptible VMs and GPUs attached to those VMs.

​ D. Create a node pool of instances with GPUs, and enable autoscaling on this node pool with a minimum size of 1. Most Voted

Q107. You are building an archival solution for your data warehouse and have selected
Cloud Storage to archive your data. Your users need to be able to access this archived
data once a quarter for some regulatory requirements. You want to select a
cost-efficient option. Which storage option should you use?

​ A. Cold Storage Most Voted

​ B. Nearline Storage Most Voted

​ C. Regional Storage

​ D. Multi-Regional Storage

Q106. Your organization has a dedicated person who creates and manages all service
accounts for Google Cloud projects. You need to assign this person the minimum role
for projects. What should you do?

​ A. Add the user to roles/iam.roleAdmin role.

​ B. Add the user to roles/iam.securityAdmin role.

​ C. Add the user to roles/iam.serviceAccountUser role.

​ D. Add the user to roles/iam.serviceAccountAdmin role. Most Voted

Q105. Your company has a single sign-on (SSO) identity provider that supports Security
Assertion Markup Language (SAML) integration with service providers. Your company
has users in Cloud Identity. You would like users to authenticate using your company's
SSO provider. What should you do?

​ A. In Cloud Identity, set up SSO with Google as an identity provider to access custom SAML apps.

​ B. In Cloud Identity, set up SSO with a third-party identity provider with Google as a service provider. Most Voted
​ C. Obtain OAuth 2.0 credentials, configure the user consent screen, and set up
OAuth 2.0 for Mobile & Desktop Apps.

​ D. Obtain OAuth 2.0 credentials, configure the user consent screen, and set up
OAuth 2.0 for Web Server Applications.

Q104. Your company set up a complex organizational structure on Google Cloud. The
structure includes hundreds of folders and projects. Only a few team members should
be able to view the hierarchical structure. You need to assign minimum permissions to
these team members, and you want to follow Google-recommended practices. What
should you do?

​ A. Add the users to roles/browser role.

​ B. Add the users to roles/iam.roleViewer role.

​ C. Add the users to a group, and add this group to roles/browser. Most Voted

​ D. Add the users to a group, and add this group to roles/iam.roleViewer role.
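The group-based binding in answer C is a single command. A sketch with placeholder group and project names; to make the entire folder hierarchy visible, the same binding would typically be made at the organization node with `gcloud organizations add-iam-policy-binding` instead:

```shell
# Bind the group, not individual users, to the Browser role
gcloud projects add-iam-policy-binding my-project-id \
    --member=group:hierarchy-viewers@example.com \
    --role=roles/browser
```

Managing access through a group follows the recommended practice: adding or removing a team member only changes group membership, not IAM policy.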

Q103. You have a developer laptop with the Cloud SDK installed on Ubuntu. The Cloud
SDK was installed from the Google Cloud Ubuntu package repository. You want to test
your application locally on your laptop with Cloud Datastore. What should you do?

​ A. Export Cloud Datastore data using gcloud datastore export.

​ B. Create a Cloud Datastore index using gcloud datastore indexes create.

​ C. Install the google-cloud-sdk-datastore-emulator component using the apt-get install command.

​ D. Install the cloud-datastore-emulator component using the gcloud components install command. Most Voted
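The emulator workflow behind answer D can be sketched as below. Note the distinction the question tests: when the SDK was installed from the apt repository, components are managed by the package manager (answer C's apt-get route), while `gcloud components install` works for an SDK that manages its own components:

```shell
# Install the local Datastore emulator component
gcloud components install cloud-datastore-emulator

# Start the emulator in the background and point the
# environment variables of the current shell at it
gcloud beta emulators datastore start &
$(gcloud beta emulators datastore env-init)
```

With the environment variables set, a locally running application talks to the emulator instead of the live Datastore service.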
Q102. Your organization needs to grant users access to query datasets in BigQuery but
prevent them from accidentally deleting the datasets. You want a solution that follows
Google-recommended practices. What should you do?

​ A. Add users to roles/bigquery.user role only, instead of roles/bigquery.dataOwner. Most Voted

​ B. Add users to roles/bigquery.dataEditor role only, instead of roles/bigquery.dataOwner.

​ C. Create a custom role by removing delete permissions, and add users to that
role only.

​ D. Create a custom role by removing delete permissions. Add users to the group, and then add the group to the custom role. Most Voted

Q101. You need to host an application on a Compute Engine instance in a project shared with other teams. You want to prevent the other teams from accidentally causing
downtime on that application. Which feature should you use?

​ A. Use a Shielded VM.

​ B. Use a Preemptible VM.

​ C. Use a sole-tenant node.

​ D. Enable deletion protection on the instance.


Q100. Your organization has strict requirements to control access to Google Cloud
projects. You need to enable your Site Reliability Engineers (SREs) to approve requests
from the Google Cloud support team when an SRE opens a support case. You want to
follow Google-recommended practices. What should you do?

​ A. Add your SREs to roles/iam.roleAdmin role.

​ B. Add your SREs to roles/accessapproval.approver role.

​ C. Add your SREs to a group and then add this group to roles/iam.roleAdmin role.

​ D. Add your SREs to a group and then add this group to roles/accessapproval.approver role. Most Voted

Q99. Your finance team wants to view the billing report for your projects. You want to
make sure that the finance team does not get additional permissions to the project.
What should you do?

​ A. Add the group for the finance team to roles/billing.user role.

​ B. Add the group for the finance team to roles/billing.admin role.

​ C. Add the group for the finance team to roles/billing.viewer role. Most Voted

​ D. Add the group for the finance team to roles/billing.projectManager role.

Q98. Your customer has implemented a solution that uses Cloud Spanner and notices
some read latency-related performance issues on one table. This table is accessed only
by their users using a primary key. The table schema is shown below.
You want to resolve the issue. What should you do?

​ A. Remove the profile_picture field from the table.

​ B. Add a secondary index on the person_id column.

​ C. Change the primary key to not have monotonically increasing values. Most Voted

​ D. Create a secondary index using the following Data Definition Language (DDL): Most Voted

Q97. You are building a product on top of Google Kubernetes Engine (GKE). You have
a single GKE cluster. For each of your customers, a Pod is running in that cluster, and
your customers can run arbitrary code inside their Pod. You want to maximize the
isolation between your customers' Pods. What should you do?

​ A. Use Binary Authorization and whitelist only the container images used by your
customers' Pods.
​ B. Use the Container Analysis API to detect vulnerabilities in the containers used
by your customers' Pods.

​ C. Create a GKE node pool with a sandbox type configured to gvisor. Add
the parameter runtimeClassName: gvisor to the specification of your
customers' Pods. Most Voted

​ D. Use the cos_containerd image for your GKE nodes. Add a nodeSelector with
the value cloud.google.com/gke-os-distribution: cos_containerd to the
specification of your customers' Pods.
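Answer C amounts to creating a sandboxed node pool and opting each customer Pod into the gVisor runtime class. A sketch with placeholder cluster, project, and image names:

```shell
# Create a node pool whose Pods run inside the gVisor sandbox
gcloud container node-pools create gvisor-pool \
    --cluster=my-cluster \
    --zone=us-central1-a \
    --sandbox=type=gvisor

# Each customer Pod then requests the sandboxed runtime class
cat <<'EOF' | kubectl apply -f -
apiVersion: v1
kind: Pod
metadata:
  name: customer-pod
spec:
  runtimeClassName: gvisor
  containers:
  - name: app
    image: gcr.io/my-project/customer-app
EOF
```

gVisor intercepts the Pod's syscalls in a user-space kernel, so arbitrary customer code is isolated from the node's host kernel.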

Q96. Your company implemented BigQuery as an enterprise data warehouse. Users from multiple business units run queries on this data warehouse. However, you notice
that query costs for BigQuery are very high, and you need to control costs. Which two
methods should you use? (Choose two.)

​ A. Split the users from business units to multiple projects.

​ B. Apply a user- or project-level custom query quota for BigQuery data warehouse. Most Voted

​ C. Create separate copies of your BigQuery data warehouse for each business
unit.

​ D. Split your BigQuery data warehouse into multiple data warehouses for each
business unit. Most Voted

​ E. Change your BigQuery query model from on-demand to flat rate. Apply
the appropriate number of slots to each Project. Most Voted

Q95. You want to find out when users were added to Cloud Spanner Identity Access
Management (IAM) roles on your Google Cloud Platform (GCP) project. What should
you do in the GCP Console?

​ A. Open the Cloud Spanner console to review configurations.


​ B. Open the IAM & admin console to review IAM policies for Cloud Spanner
roles.

​ C. Go to the Stackdriver Monitoring console and review information for Cloud Spanner.

​ D. Go to the Stackdriver Logging console, review admin activity logs, and filter them for Cloud Spanner IAM roles. Most Voted

Q94. You create a Deployment with 2 replicas in a Google Kubernetes Engine cluster
that has a single preemptible node pool. After a few minutes, you use kubectl to
examine the status of your Pod and observe that one of them is still in Pending status:

What is the most likely cause?

​ A. The pending Pod's resource requests are too large to fit on a single node of
the cluster.

​ B. Too many Pods are already running in the cluster, and there are not
enough resources left to schedule the pending Pod. Most Voted

​ C. The node pool is configured with a service account that does not have
permission to pull the container image used by the pending Pod.

​ D. The pending Pod was originally scheduled on a node that has been
preempted between the creation of the Deployment and your verification of the
Pods' status. It is currently being rescheduled on a new node. Most Voted
Q93. Your company uses BigQuery for data warehousing. Over time, many different
business units in your company have created 1000+ datasets across hundreds of
projects. Your CIO wants you to examine all datasets to find tables that contain an
employee_ssn column. You want to minimize effort in performing this task.
What should you do?

​ A. Go to Data Catalog and search for employee_ssn in the search box. Most
Voted

​ B. Write a shell script that uses the bq command line tool to loop through all the
projects in your organization.

​ C. Write a script that loops through all the projects in your organization and runs
a query on INFORMATION_SCHEMA.COLUMNS view to find the employee_ssn
column.

​ D. Write a Cloud Dataflow job that loops through all the projects in your
organization and runs a query on INFORMATION_SCHEMA.COLUMNS view to
find employee_ssn column.

Q92. You want to select and configure a solution for storing and archiving data on
Google Cloud Platform. You need to support compliance objectives for data from one
geographic location. This data is archived after 30 days and needs to be accessed
annually. What should you do?

​ A. Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data
after 30 days to Coldline Storage.

​ B. Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data
after 30 days to Nearline Storage.

​ C. Select Regional Storage. Add a bucket lifecycle rule that archives data after
30 days to Nearline Storage.
​ D. Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage. Most Voted
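The lifecycle rule from answer D can be expressed as a JSON policy applied with gsutil. A sketch, assuming a placeholder bucket name:

```shell
# Move objects to Coldline 30 days after creation
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 30}
    }
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://my-regional-bucket
```

Regional Storage satisfies the single-geography compliance requirement, and Coldline fits data accessed roughly once a year.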

Q91. Your company's infrastructure is on-premises, but all machines are running at
maximum capacity. You want to burst to Google Cloud. The workloads on Google
Cloud must be able to directly communicate to the workloads on-premises using a
private IP range. What should you do?

​ A. In Google Cloud, configure the VPC as a host for Shared VPC.

​ B. In Google Cloud, configure the VPC for VPC Network Peering.

​ C. Create bastion hosts both in your on-premises environment and on Google Cloud. Configure both as proxy servers using their public IP addresses.

​ D. Set up Cloud VPN between the infrastructure on-premises and Google Cloud. Most Voted

Q90. You want to configure a solution for archiving data in a Cloud Storage bucket. The
solution must be cost-effective. Data with multiple versions should be archived after 30
days. Previous versions are accessed once a month for reporting. This archive data is
also occasionally updated at month-end. What should you do?

​ A. Add a bucket lifecycle rule that archives data with newer versions after 30
days to Coldline Storage.

​ B. Add a bucket lifecycle rule that archives data with newer versions after 30
days to Nearline Storage. Most Voted

​ C. Add a bucket lifecycle rule that archives data from regional storage after 30
days to Coldline Storage.
​ D. Add a bucket lifecycle rule that archives data from regional storage after 30
days to Nearline Storage.

Q89. You are migrating a production-critical on-premises application that requires 96 vCPUs to perform its task. You want to make sure the application runs in a similar
environment on GCP. What should you do?

​ A. When creating the VM, use machine type n1-standard-96. Most Voted

​ B. When creating the VM, use Intel Skylake as the CPU platform.

​ C. Create the VM using Compute Engine default settings. Use gcloud to modify
the running instance to have 96 vCPUs.

​ D. Start the VM using Compute Engine default settings, and adjust as you go
based on Rightsizing Recommendations.

Q88. You need to set a budget alert for use of Compute Engine services on one of the
three Google Cloud Platform projects that you manage. All three projects are linked to a
single billing account. What should you do?

​ A. Verify that you are the project billing administrator. Select the associated
billing account and create a budget and alert for the appropriate project.
Most Voted

​ B. Verify that you are the project billing administrator. Select the associated billing
account and create a budget and a custom alert.

​ C. Verify that you are the project administrator. Select the associated billing
account and create a budget for the appropriate project.
​ D. Verify that you are project administrator. Select the associated billing account
and create a budget and a custom alert.

Q87. You deployed an LDAP server on Compute Engine that is reachable via TLS
through port 636 using UDP. You want to make sure it is reachable by clients over that
port. What should you do?

​ A. Add the network tag allow-udp-636 to the VM instance running the LDAP
server.

​ B. Create a route called allow-udp-636 and set the next hop to be the VM
instance running the LDAP server.

​ C. Add a network tag of your choice to the instance. Create a firewall rule to
allow ingress on UDP port 636 for that network tag. Most Voted

​ D. Add a network tag of your choice to the instance running the LDAP server.
Create a firewall rule to allow egress on UDP port 636 for that network tag.
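Answer C translates to tagging the instance and opening the port with an ingress firewall rule. A sketch with placeholder instance, tag, and zone names:

```shell
# Tag the LDAP server instance
gcloud compute instances add-tags ldap-server-vm \
    --tags=ldap-server \
    --zone=us-central1-a

# Allow ingress UDP 636 to instances carrying that tag
gcloud compute firewall-rules create allow-udp-636 \
    --network=default \
    --direction=INGRESS \
    --allow=udp:636 \
    --target-tags=ldap-server
```

Egress (answer D) is wrong here because client connections arriving at the server are inbound traffic.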

Q86. You need to verify that a Google Cloud Platform service account was created at a
particular time. What should you do?

​ A. Filter the Activity log to view the Configuration category. Filter the
Resource type to Service Account. Most Voted

​ B. Filter the Activity log to view the Configuration category. Filter the Resource
type to Google Project.

​ C. Filter the Activity log to view the Data Access category. Filter the Resource
type to Service Account.

​ D. Filter the Activity log to view the Data Access category. Filter the Resource
type to Google Project.
Q85. You have a large 5-TB AVRO file stored in a Cloud Storage bucket. Your analysts
are proficient only in SQL and need access to the data stored in this file. You want to
find a cost-effective way to complete their request as soon as possible. What should
you do?

​ A. Load data in Cloud Datastore and run a SQL query against it.

​ B. Create a BigQuery table and load data in BigQuery. Run a SQL query on this
table and drop this table after you complete your request.

​ C. Create external tables in BigQuery that point to Cloud Storage buckets and run a SQL query on these external tables to complete your request. Most Voted

​ D. Create a Hadoop cluster and copy the AVRO file to HDFS by compressing it. Load the file into a Hive table and provide access to your analysts so that they can run SQL queries.
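Answer C can be done with the bq tool: define an external table over the Avro file and query it in place, so no data is loaded into BigQuery storage. A sketch, assuming placeholder dataset, table, and bucket names:

```shell
# Create an external table backed by the Avro file in Cloud Storage
bq mk \
    --external_table_definition=@AVRO=gs://my-bucket/large-file.avro \
    mydataset.avro_external

# Analysts then query it with standard SQL
bq query --use_legacy_sql=false \
    'SELECT COUNT(*) FROM mydataset.avro_external'
```

This is fast to set up and cost-effective: you pay for the query, not for storing a second copy of the 5-TB file.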

Q84. You have a Google Cloud Platform account with access to both production and
development projects. You need to create an automated process to list all compute
instances in development and production projects on a daily basis. What should you
do?

​ A. Create two configurations using gcloud config. Write a script that sets
configurations as active, individually. For each configuration, use gcloud
compute instances list to get a list of compute resources. Most Voted

​ B. Create two configurations using gsutil config. Write a script that sets
configurations as active, individually. For each configuration, use gsutil compute
instances list to get a list of compute resources.

​ C. Go to Cloud Shell and export this information to Cloud Storage on a daily basis.
​ D. Go to GCP Console and export this information to Cloud SQL on a daily basis.
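The daily listing from answer A can be sketched as a small script over two named configurations. This assumes the configurations were created beforehand with `gcloud config configurations create` and bound to their projects with `gcloud config set project`; all names are placeholders:

```shell
#!/bin/bash
# Activate each configuration in turn and list its Compute Engine instances
for cfg in dev-config prod-config; do
  gcloud config configurations activate "$cfg"
  gcloud compute instances list
done
```

Scheduled via cron, this produces the combined inventory of both projects once a day.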

Q83. Your organization uses G Suite for communication and collaboration. All users in
your organization have a G Suite account. You want to grant some G Suite users
access to your Cloud Platform project. What should you do?

​ A. Enable Cloud Identity in the GCP Console for your domain.

​ B. Grant them the required IAM roles using their G Suite email address. Most
Voted

​ C. Create a CSV sheet with all users' email addresses. Use the gcloud command
line tool to convert them into Google Cloud Platform accounts.

​ D. In the G Suite console, add the users to a special group called cloud-console-users@yourdomain.com. Rely on the default behavior of the Cloud Platform to grant users access if they are members of this group.

Q82. Your VMs are running in a subnet that has a subnet mask of 255.255.255.240.
The current subnet has no more free IP addresses and you require an additional
10 IP addresses for new VMs. The existing and new VMs should all be able to reach
each other without additional routes. What should you do?

​ A. Use gcloud to expand the IP range of the current subnet. Most Voted

​ B. Delete the subnet, and recreate it using a wider range of IP addresses.

​ C. Create a new project. Use Shared VPC to share the current network with the
new project.

​ D. Create a new subnet with the same starting IP but a wider range to overwrite
the current subnet.
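A 255.255.255.240 mask is a /28, which leaves only 16 addresses; widening it to /27 doubles the range to 32 addresses, covering the 10 extra VMs. Answer A does this in place, without disturbing existing VMs or routes (subnet name and region are placeholders):

```shell
# Expand the subnet's range from /28 to /27; existing IPs are unaffected
gcloud compute networks subnets expand-ip-range my-subnet \
    --region=us-central1 \
    --prefix-length=27
```

Subnet ranges can only be expanded, never shrunk, and the wider range must not overlap another subnet in the VPC.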
Q80. You want to add a new auditor to a Google Cloud Platform project. The auditor
should be allowed to read, but not modify, all project items.
How should you configure the auditor's permissions?

​ A. Create a custom role with view-only project permissions. Add the user's
account to the custom role. Most Voted

​ B. Create a custom role with view-only service permissions. Add the user's
account to the custom role.

​ C. Select the built-in IAM project Viewer role. Add the user's account to this role.
Most Voted

​ D. Select the built-in IAM service Viewer role. Add the user's account to this role.

Q79. You need to enable traffic between multiple groups of Compute Engine instances
that are currently running two different GCP projects. Each group of Compute
Engine instances is running in its own VPC. What should you do?

​ A. Verify that both projects are in a GCP Organization. Create a new VPC and
add all instances.

​ B. Verify that both projects are in a GCP Organization. Share the VPC from
one project and request that the Compute Engine instances in the other
project use this shared VPC. Most Voted

​ C. Verify that you are the Project Administrator of both projects. Create two new
VPCs and add all instances.

​ D. Verify that you are the Project Administrator of both projects. Create a new
VPC and add all instances.
Q78. You are using Google Kubernetes Engine with autoscaling enabled to host a new
application. You want to expose this new application to the public, using HTTPS on a
public IP address. What should you do?

​ A. Create a Kubernetes Service of type NodePort for your application, and a Kubernetes Ingress to expose this Service via a Cloud Load Balancer. Most Voted

​ B. Create a Kubernetes Service of type ClusterIP for your application. Configure the public DNS name of your application using the IP of this Service.

​ C. Create a Kubernetes Service of type NodePort to expose the application on port 443 of each node of the Kubernetes cluster. Configure the public DNS name of your application with the IP of every node of the cluster to achieve load-balancing.

​ D. Create a HAProxy pod in the cluster to load-balance the traffic to all the pods
of the application. Forward the public traffic to HAProxy with an iptable rule.
Configure the DNS name of your application using the public IP of the node
HAProxy is running on.
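Answer A pairs a NodePort Service with an Ingress, which GKE fulfills by provisioning an external HTTP(S) load balancer with a public IP. A minimal sketch applied via kubectl; the names, ports, and TLS secret are placeholders, and the secret holding the certificate is assumed to exist:

```shell
cat <<'EOF' | kubectl apply -f -
apiVersion: v1
kind: Service
metadata:
  name: my-app
spec:
  type: NodePort
  selector:
    app: my-app
  ports:
  - port: 8080
    targetPort: 8080
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: my-app-ingress
spec:
  tls:
  - secretName: my-app-tls   # certificate used for HTTPS termination
  defaultBackend:
    service:
      name: my-app
      port:
        number: 8080
EOF
```

The Ingress terminates HTTPS at the load balancer, so the Pods behind it can keep serving plain HTTP.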

Q77. You need to provide a cost estimate for a Kubernetes cluster using the GCP pricing calculator for Kubernetes. Your workload requires high IOPS, and you will also be using
disk snapshots. You start by entering the number of nodes, average hours, and average
days. What should you do next?

​ A. Fill in local SSD. Fill in persistent disk storage and snapshot storage.
Most Voted

​ B. Fill in local SSD. Add estimated cost for cluster management.

​ C. Select Add GPUs. Fill in persistent disk storage and snapshot storage.

​ D. Select Add GPUs. Add estimated cost for cluster management.


Q76. You are building a new version of an application hosted in an App Engine
environment. You want to test the new version with 1% of users before you completely
switch your application over to the new version. What should you do?

​ A. Deploy a new version of your application in Google Kubernetes Engine instead of App Engine and then use GCP Console to split traffic.

​ B. Deploy a new version of your application in a Compute Engine instance instead of App Engine and then use GCP Console to split traffic.

​ C. Deploy a new version as a separate app in App Engine. Then configure App
Engine using GCP Console to split traffic between the two apps.

​ D. Deploy a new version of your application in App Engine. Then go to App Engine settings in GCP Console and split traffic between the current version and newly deployed versions accordingly. Most Voted

Q75. You need to produce a list of the enabled Google Cloud Platform APIs for a GCP
project using the gcloud command line in the Cloud Shell. The project name is
my-project. What should you do?

​ A. Run gcloud projects list to get the project ID, and then run gcloud
services list --project <project ID>. Most Voted

​ B. Run gcloud init to set the current project to my-project, and then run gcloud
services list --available.

​ C. Run gcloud info to view the account value, and then run gcloud services list
--account <Account>.

​ D. Run gcloud projects describe <project ID> to verify the project value, and then
run gcloud services list --available.
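The two commands from answer A, in order (the project ID value is a placeholder you would read from the first command's output):

```shell
# Find the project ID for the project named my-project
gcloud projects list --filter='name:my-project'

# List only the APIs that are enabled on that project
gcloud services list --enabled --project=my-project-id
```

Note the trap in the distractors: `gcloud services list --available` shows every API the project *could* enable, not the ones currently enabled.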
Q74. You want to configure an SSH connection to a single Compute Engine instance for
users in the dev1 group. This instance is the only resource in this particular
Google Cloud Platform project that the dev1 users should be able to connect to. What
should you do?

​ A. Set metadata to enable-oslogin=true for the instance. Grant the dev1 group the compute.osLogin role. Direct them to use the Cloud Shell to ssh to that instance. Most Voted

​ B. Set metadata to enable-oslogin=true for the instance. Set the service account
to no service account for that instance. Direct them to use the Cloud Shell to ssh
to that instance.

​ C. Enable block project wide keys for the instance. Generate an SSH key for
each user in the dev1 group. Distribute the keys to dev1 users and direct them to
use their third-party tools to connect.

​ D. Enable block project wide keys for the instance. Generate an SSH key and
associate the key with that instance. Distribute the key to dev1 users and direct
them to use their third-party tools to connect.
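Answer A in commands, with placeholder instance, zone, project, and group values. Because the role can be granted on the instance itself (or, as here, scoped tightly in a project containing only this resource), dev1 users gain SSH access to nothing else:

```shell
# Enable OS Login on just this instance via instance metadata
gcloud compute instances add-metadata dev-instance \
    --zone=us-central1-a \
    --metadata=enable-oslogin=TRUE

# Grant the dev1 group SSH access through OS Login
gcloud projects add-iam-policy-binding my-project-id \
    --member=group:dev1@example.com \
    --role=roles/compute.osLogin
```

OS Login ties SSH access to IAM, so revoking a user is a group-membership change rather than key cleanup.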

Q73. You are setting up a Windows VM on Compute Engine and want to make sure you
can log in to the VM via RDP. What should you do?

​ A. After the VM has been created, use your Google Account credentials to log in
into the VM.

​ B. After the VM has been created, use gcloud compute reset-windows-password to retrieve the login credentials for the VM. Most Voted
​ C. When creating the VM, add metadata to the instance using
'windows-password' as the key and a password as the value.

​ D. After the VM has been created, download the JSON private key for the default
Compute Engine service account. Use the credentials in the JSON file to log in to
the VM.
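Answer B is a single command that creates (or resets) a Windows account on the VM and prints a freshly generated password for RDP; the instance, zone, and username are placeholders:

```shell
gcloud compute reset-windows-password my-windows-vm \
    --zone=us-central1-a \
    --user=admin_user
```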

Q72. You deployed a new application inside your Google Kubernetes Engine cluster
using the YAML file specified below.

You check the status of the deployed pods and notice that one of them is still in
PENDING status:
You want to find out why the pod is stuck in pending status. What should you do?

​ A. Review details of the myapp-service Service object and check for error
messages.

​ B. Review details of the myapp-deployment Deployment object and check for error messages.

​ C. Review details of myapp-deployment-58ddbbb995-lp86m Pod and check for warning messages. Most Voted

​ D. View logs of the container in myapp-deployment-58ddbbb995-lp86m pod and check for warning messages.
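Answer C maps to kubectl describe; the Events section at the bottom of its output normally states why scheduling failed (for example, insufficient CPU or memory on the nodes). Container logs (answer D) cannot help here because a Pending Pod's containers have not started:

```shell
# Inspect the stuck Pod; read the Events section for scheduler warnings
kubectl describe pod myapp-deployment-58ddbbb995-lp86m
```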

Q71. You are using Container Registry to centrally store your company's container
images in a separate project. In another project, you want to create a Google
Kubernetes Engine (GKE) cluster. You want to ensure that Kubernetes can download
images from Container Registry. What should you do?

​ A. In the project where the images are stored, grant the Storage Object
Viewer IAM role to the service account used by the Kubernetes nodes. Most
Voted

​ B. When you create the GKE cluster, choose the Allow full access to all Cloud
APIs option under 'Access scopes'.

​ C. Create a service account, and give it access to Cloud Storage. Create a P12
key for this service account and use it as an imagePullSecrets in Kubernetes.
​ D. Configure the ACLs on each image in Cloud Storage to give read-only access
to the default Compute Engine service account.
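Answer A is one IAM binding in the registry project, granting the node service account read access to the Cloud Storage bucket that backs Container Registry. The project IDs and service-account address below are placeholders:

```shell
gcloud projects add-iam-policy-binding registry-project-id \
    --member=serviceAccount:gke-nodes@cluster-project-id.iam.gserviceaccount.com \
    --role=roles/storage.objectViewer
```

This follows least privilege: the nodes can pull images but cannot push or delete them.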
