Google
Exam Questions Associate-Cloud-Engineer
Google Cloud Certified - Associate Cloud Engineer
NEW QUESTION 1
Your company has multiple projects linked to a single billing account in Google Cloud. You need to visualize the costs with specific metrics that should be
dynamically calculated based on company-specific criteria. You want to automate the process. What should you do?
A. In the Google Cloud console, visualize the costs related to the projects in the Reports section.
B. In the Google Cloud console, visualize the costs related to the projects in the Cost breakdown section.
C. In the Google Cloud console, use the export functionality of the Cost table. Create a Looker Studio dashboard on top of the CSV export.
D. Configure Cloud Billing data export to BigQuery for the billing account. Create a Looker Studio dashboard on top of the BigQuery export.
Answer: D
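As a worked illustration (the project, dataset, and table names below are hypothetical; the real export table is named after your billing account ID), company-specific metrics can be computed over the BigQuery billing export and then surfaced in a Looker Studio dashboard:
# Sum last month's cost per project and service from the standard billing export.
bq query --use_legacy_sql=false '
SELECT project.id AS project_id,
       service.description AS service,
       SUM(cost) AS total_cost
FROM `my-admin-project.billing_export.gcp_billing_export_v1_XXXXXX`
WHERE invoice.month = FORMAT_DATE("%Y%m", DATE_SUB(CURRENT_DATE(), INTERVAL 1 MONTH))
GROUP BY project_id, service
ORDER BY total_cost DESC'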
NEW QUESTION 2
Your organization has a dedicated person who creates and manages all service accounts for Google Cloud projects. You need to assign this person the minimum
role for projects. What should you do?
Answer: D
NEW QUESTION 3
You have been asked to set up the billing configuration for a new Google Cloud customer. Your customer wants to group resources that share common IAM
policies. What should you do?
Answer: B
Explanation:
Folders are nodes in the Cloud Platform Resource Hierarchy. A folder can contain projects, other folders, or a combination of both. Organizations can use folders
to group projects under the organization node in a hierarchy. For example, your organization might contain multiple departments, each with its own set of Google
Cloud resources. Folders allow you to group these resources on a per-department basis. Folders are used to group resources that share common IAM policies.
While a folder can contain multiple folders or resources, a given folder or resource can have exactly one parent.
https://cloud.google.com/resource-manager/docs/creating-managing-folders
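For illustration, a folder can be created from the CLI, and projects moved under it then inherit its IAM policy (the display name and organization ID are placeholders):
gcloud resource-manager folders create \
  --display-name="Engineering" \
  --organization=ORG_ID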
NEW QUESTION 4
You have 32 GB of data in a single file that you need to upload to a Nearline Storage bucket. The WAN connection you are using is rated at 1 Gbps, and you are
the only one on the connection. You want to use as much of the rated 1 Gbps as possible to transfer the file rapidly. How should you upload the file?
Answer: B
Explanation:
https://cloud.google.com/storage/docs/parallel-composite-uploads https://cloud.google.com/storage/docs/uploads-downloads#parallel-composite-uploads
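A sketch of the upload with parallel composite uploads enabled (file name, bucket, and threshold are illustrative):
# Objects above the threshold are split into components uploaded in parallel, then composed.
gsutil -o GSUtil:parallel_composite_upload_threshold=150M \
  cp big-dataset.tar gs://my-nearline-bucket/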
NEW QUESTION 5
Your company's security vulnerability management policy requires a member of the security team to have visibility into vulnerabilities and other OS metadata for a specific Compute Engine instance. This Compute Engine instance hosts a critical application in your Google Cloud project. You need to implement your company's security vulnerability management policy. What should you do?
A. • Ensure that the Ops Agent is installed on the Compute Engine instance. • Create a custom metric in the Cloud Monitoring dashboard. • Provide the security team member with access to this dashboard.
B. • Ensure that the Ops Agent is installed on the Compute Engine instance. • Provide the security team member the roles/osconfig.inventoryViewer permission.
C. • Ensure that the OS Config agent is installed on the Compute Engine instance. • Provide the security team member the roles/osconfig.vulnerabilityReportViewer permission.
D. • Ensure that the OS Config agent is installed on the Compute Engine instance. • Create a log sink to a BigQuery dataset. • Provide the security team member with access to this dataset.
Answer: C
NEW QUESTION 6
You need to manage a third-party application that will run on a Compute Engine instance. Other Compute Engine instances are already running with default
configuration. Application installation files are hosted on Cloud Storage. You need to access these files from the new instance without allowing other virtual
machines (VMs) to access these files. What should you do?
A. Create the instance with the default Compute Engine service account Grant the service account permissions on Cloud Storage.
B. Create the instance with the default Compute Engine service account Add metadata to the objects on Cloud Storage that matches the metadata on the new
instance.
C. Create a new service account and assign this service account to the new instance. Grant the service account permissions on Cloud Storage.
D. Create a new service account and assign this service account to the new instance. Add metadata to the objects on Cloud Storage that matches the metadata on the new instance.
Answer: C
Explanation:
https://cloud.google.com/iam/docs/best-practices-for-using-and-managing-service-accounts
If an application uses third-party or custom identities and needs to access a resource, such as a BigQuery dataset or a Cloud Storage bucket, it must perform a
transition between principals. Because Google Cloud APIs don't recognize third-party or custom identities, the application can't propagate the end-user's identity to
BigQuery or Cloud Storage. Instead, the application has to perform the access by using a different Google identity.
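As a minimal sketch of the recommended approach (project, bucket, and account names are placeholders), a dedicated service account is created, granted read access only on the installer bucket, and attached to the new instance, so VMs running as the default service account get no access:
# Create a dedicated service account for the new instance.
gcloud iam service-accounts create app-installer --display-name="Third-party app installer"
# Grant it read access on the bucket that holds the installation files.
gsutil iam ch serviceAccount:app-installer@MY_PROJECT.iam.gserviceaccount.com:roles/storage.objectViewer gs://installer-files-bucket
# Attach the service account when creating the new instance.
gcloud compute instances create app-vm --zone=us-central1-a \
  --service-account=app-installer@MY_PROJECT.iam.gserviceaccount.com \
  --scopes=https://www.googleapis.com/auth/cloud-platform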
NEW QUESTION 7
You are building a data lake on Google Cloud for your Internet of Things (loT) application. The loT application has millions of sensors that are constantly streaming
structured and unstructured data to your backend in the cloud. You want to build a highly available and resilient architecture based on
Google-recommended practices. What should you do?
A. Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage.
B. Stream data to Pub/Sub, and use Storage Transfer Service to send data to BigQuery.
C. Stream data to Dataflow, and use Storage Transfer Service to send data to BigQuery.
D. Stream data to Dataflow, and use Dataprep by Trifacta to send data to Bigtable.
Answer: A
NEW QUESTION 8
You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data. You want access to the
content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You
want to use the most secure method that requires the fewest steps. What should you do?
A. Create a signed URL with a four-hour expiration and share the URL with the company.
B. Set object access to ‘public’ and use object lifecycle management to remove the object after four hours.
C. Configure the storage bucket as a static website and furnish the object’s URL to the company. Delete the object from the storage bucket after four hours.
D. Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.
Answer: A
Explanation:
Signed URLs are used to give time-limited resource access to anyone in possession of the URL, regardless of whether they have a Google account.
https://cloud.google.com/storage/docs/access-control/signed-urls
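A minimal sketch of generating such a URL with gsutil, assuming a service-account key file is available for signing and using placeholder bucket and object names:
# Create a signed URL that expires four hours after creation.
gsutil signurl -d 4h sa-key.json gs://sensitive-data-bucket/report.csv
The resulting URL can be handed to the external company and stops working once the four-hour window has elapsed.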
NEW QUESTION 9
You have created an application that is packaged into a Docker image. You want to deploy the Docker image as a workload on Google Kubernetes Engine. What
should you do?
A. Upload the image to Cloud Storage and create a Kubernetes Service referencing the image.
B. Upload the image to Cloud Storage and create a Kubernetes Deployment referencing the image.
C. Upload the image to Container Registry and create a Kubernetes Service referencing the image.
D. Upload the image to Container Registry and create a Kubernetes Deployment referencing the image.
Answer: D
Explanation:
A deployment is responsible for keeping a set of pods running. A service is responsible for enabling network access to a set of pods.
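A minimal sketch, assuming the image was pushed to Container Registry under a hypothetical path:
# Create a Deployment that keeps the container running on the GKE cluster.
kubectl create deployment my-app --image=gcr.io/MY_PROJECT/my-app:v1
# Optionally expose the Deployment through a Service for network access.
kubectl expose deployment my-app --port=80 --target-port=8080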
NEW QUESTION 10
You need to set up permissions for a set of Compute Engine instances to enable them to write data into a particular Cloud Storage bucket. You want to follow
Google-recommended practices. What should you do?
Answer: C
Explanation:
https://cloud.google.com/iam/docs/understanding-service-accounts#using_service_accounts_with_compute_eng
https://cloud.google.com/storage/docs/access-control/iam-roles
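A sketch under assumed names: create a dedicated service account, give it write access on the bucket, and attach it to the instances (or to their instance template):
gcloud iam service-accounts create bucket-writer
gsutil iam ch serviceAccount:bucket-writer@MY_PROJECT.iam.gserviceaccount.com:roles/storage.objectCreator gs://target-data-bucket
gcloud compute instances create worker-1 --zone=us-central1-a \
  --service-account=bucket-writer@MY_PROJECT.iam.gserviceaccount.com \
  --scopes=cloud-platform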
NEW QUESTION 10
You are deploying an application to App Engine. You want the number of instances to scale based on request rate. You need at least 3 unoccupied instances at all
times. Which scaling type should you use?
Answer: D
NEW QUESTION 15
You need to manage a Cloud Spanner Instance for best query performance. Your instance in production runs in a single Google Cloud region. You need to
improve performance in the shortest amount of time. You want to follow Google best practices for service configuration. What should you do?
A. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. If you exceed this threshold, add nodes to your instance.
B. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.
C. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. If you exceed this threshold, add nodes to your instance.
D. Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.
Answer: C
Explanation:
https://cloud.google.com/spanner/docs/cpu-utilization#recommended-max
NEW QUESTION 17
You are building a pipeline to process time-series data. Which Google Cloud Platform services should you put in boxes 1,2,3, and 4?
Answer: D
NEW QUESTION 22
You are in charge of provisioning access for all Google Cloud users in your organization. Your company recently acquired a startup company that has their own
Google Cloud organization. You need to ensure that your Site Reliability Engineers (SREs) have the same project permissions in the startup company's
organization as in your own organization. What should you do?
A. In the Google Cloud console for your organization, select Create role from selection, and choose destination as the startup company's organization
B. In the Google Cloud console for the startup company, select Create role from selection and choose source as the startup company's Google Cloud organization.
C. Use the gcloud iam roles copy command, and provide the Organization ID of the startup company's Google Cloud Organization as the destination.
D. Use the gcloud iam roles copy command, and provide the project IDs of all projects in the startup company s organization as the destination.
Answer: C
Explanation:
https://cloud.google.com/architecture/best-practices-vpc-design#shared-service Cloud VPN is another alternative. Because Cloud VPN establishes reachability
through managed IPsec tunnels, it doesn't have the aggregate limits of VPC Network Peering. Cloud VPN uses a VPN Gateway for connectivity and doesn't
consider the aggregate resource use of the IPsec peer. The drawbacks of Cloud VPN include increased costs (VPN tunnels and traffic egress), management
overhead required to maintain tunnels, and the performance overhead of IPsec.
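A hedged example of the command, with a hypothetical custom role name and placeholder Organization IDs:
# Copy the custom SRE role from your organization to the startup company's organization.
gcloud iam roles copy \
  --source="sreProjectRole" \
  --source-organization=SOURCE_ORG_ID \
  --destination="sreProjectRole" \
  --dest-organization=STARTUP_ORG_ID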
NEW QUESTION 24
You create a new Google Kubernetes Engine (GKE) cluster and want to make sure that it always runs a supported and stable version of Kubernetes. What should
you do?
Answer: B
Explanation:
Creating or upgrading a cluster by specifying the version as latest does not provide automatic upgrades. Enable node auto-upgrades to ensure that the nodes in
your cluster are up-to-date with the latest stable version.
https://cloud.google.com/kubernetes-engine/versioning-and-upgrades
Node auto-upgrades help you keep the nodes in your cluster up to date with the cluster master version when your master is updated on your behalf. When you
create a new cluster or node pool with Google Cloud Console or the gcloud command, node auto-upgrade is enabled by default.
Ref: https://cloud.google.com/kubernetes-engine/docs/how-to/node-auto-upgrades
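For illustration, a cluster enrolled in a release channel stays on a supported version, and node auto-upgrade is enabled by default for release-channel clusters (cluster name and zone are placeholders):
gcloud container clusters create my-cluster \
  --zone=us-central1-a \
  --release-channel=stable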
NEW QUESTION 27
Your managed instance group raised an alert stating that new instance creation has failed to create new instances. You need to maintain the number of running
instances specified by the template to be able to process expected application traffic. What should you do?
A. Create an instance template that contains valid syntax which will be used by the instance group. Delete any persistent disks with the same name as instance names.
B. Create an instance template that contains valid syntax that will be used by the instance group. Verify that the instance name and persistent disk name values are not the same in the template.
C. Verify that the instance template being used by the instance group contains valid syntax. Delete any persistent disks with the same name as instance names. Set the disks.autoDelete property to true in the instance template.
D. Delete the current instance template and replace it with a new instance template. Verify that the instance name and persistent disk name values are not the same in the template. Set the disks.autoDelete property to true in the instance template.
Answer: A
Explanation:
https://cloud.google.com/compute/docs/troubleshooting/troubleshooting-migs
https://cloud.google.com/compute/docs/instance-templates#how_to_update_instance_templates
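As a diagnostic sketch (group name and zone are placeholders), the group's status and recent instance-creation errors can be inspected before correcting the template:
gcloud compute instance-groups managed describe my-mig --zone=us-central1-a
# If available in your gcloud version, list recent instance-creation errors:
gcloud compute instance-groups managed list-errors my-mig --zone=us-central1-a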
NEW QUESTION 32
You are building an application that processes data files uploaded from thousands of suppliers. Your primary goals for the application are data security and the
expiration of aged data. You need to design the application to:
• Restrict access so that suppliers can access only their own data.
• Give suppliers write access to data only for 30 minutes.
• Delete data that is over 45 days old.
You have a very short development cycle, and you need to make sure that the application requires minimal maintenance. Which two strategies should you use?
(Choose two.)
Answer: AB
Explanation:
(A) Object Lifecycle Management Delete
The Delete action deletes an object when the object meets all conditions specified in the lifecycle rule.
Exception: In buckets with Object Versioning enabled, deleting the live version of an object causes it to become a noncurrent version, while deleting a noncurrent
version deletes that version permanently.
https://cloud.google.com/storage/docs/lifecycle#delete
(B) Signed URLs
This page provides an overview of signed URLs, which you use to give time-limited resource access to anyone in possession of the URL, regardless of whether
they have a Google account
https://cloud.google.com/storage/docs/access-control/signed-urls
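A minimal sketch of the 45-day deletion rule, assuming a hypothetical bucket name:
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 45}}
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://supplier-uploads-bucket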
NEW QUESTION 37
You have an application that runs on Compute Engine VM instances in a custom Virtual Private Cloud (VPC). Your company's security policies only allow the use of internal IP addresses on VM instances and do not let VM instances connect to the internet. You need to ensure that the application can access a file hosted in a
Cloud Storage bucket within your project. What should you do?
Answer: A
NEW QUESTION 42
You are assigned to maintain a Google Kubernetes Engine (GKE) cluster named dev that was deployed on Google Cloud. You want to manage the GKE
configuration using the command line interface (CLI). You have just downloaded and installed the Cloud SDK. You want to ensure that future CLI commands by
default address this specific cluster. What should you do?
Answer: A
Explanation:
To set a default cluster for gcloud commands, run the following command: gcloud config set container/cluster CLUSTER_NAME
https://cloud.google.com/kubernetes-engine/docs/how-to/managing-clusters?hl=en
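A short sketch assuming the dev cluster runs in zone us-central1-a:
# Fetch credentials so kubectl talks to the dev cluster.
gcloud container clusters get-credentials dev --zone=us-central1-a
# Make dev the default cluster for future gcloud container commands.
gcloud config set container/cluster dev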
NEW QUESTION 43
You have a Google Cloud Platform account with access to both production and development projects. You need to create an automated process to list all compute
instances in development and production projects on a daily basis. What should you do?
Answer: A
Explanation:
You can create two configurations – one for the development project and another for the production project – by running the gcloud config configurations create command.
https://cloud.google.com/sdk/gcloud/reference/config/configurations/create
In your custom script, you can load these configurations one at a time and execute gcloud compute instances list to list the Compute Engine instances in the project that is active in the gcloud configuration.
Ref: https://cloud.google.com/sdk/gcloud/reference/compute/instances/list
Once you have this information, you can export it in a suitable format to a suitable target, e.g. export as CSV or export to Cloud Storage/BigQuery/SQL, etc.
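A sketch of such a daily script, assuming two configurations named dev-config and prod-config already exist and point at the right projects:
#!/usr/bin/env bash
# Append the instance list from both projects to one dated CSV file.
for cfg in dev-config prod-config; do
  gcloud config configurations activate "$cfg"
  gcloud compute instances list --format="csv(name,zone,status)" \
    >> "instances-$(date +%F).csv"
done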
NEW QUESTION 45
You want to run a single caching HTTP reverse proxy on GCP for a latency-sensitive website. This specific reverse proxy consumes almost no CPU. You want to
have a 30-GB in-memory cache, and need an additional 2 GB of memory for the rest of the processes. You want to minimize cost. How should you run this reverse
proxy?
Answer: A
Explanation:
What is Google Cloud Memorystore?
Overview. Cloud Memorystore for Redis is a fully managed Redis service for Google Cloud Platform. Applications running on Google Cloud Platform can achieve
extreme performance by leveraging the highly scalable, highly available, and secure Redis service without the burden of managing complex Redis deployments.
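If the Memorystore route is taken, a sketch of provisioning a 32 GB Redis instance (name, region, and tier are placeholders):
gcloud redis instances create http-cache --size=32 --region=us-central1 --tier=basic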
NEW QUESTION 46
You are building an archival solution for your data warehouse and have selected Cloud Storage to archive your data. Your users need to be able to access this
archived data once a quarter for some regulatory requirements. You want to select a cost-efficient option. Which storage option should you use?
A. Coldline Storage
B. Nearline Storage
C. Regional Storage
D. Multi-Regional Storage
Answer: A
Explanation:
Coldline Storage is a very-low-cost, highly durable storage service for storing infrequently accessed data. Coldline Storage is ideal for data you plan to read or
modify at most once a quarter. Since we have a requirement to access data once a quarter and want to go with the most cost-efficient option, we should select
Coldline Storage.
Ref: https://cloud.google.com/storage/docs/storage-classes#coldline
NEW QUESTION 50
You manage an App Engine Service that aggregates and visualizes data from BigQuery. The application is deployed with the default App Engine Service account.
The data that needs to be visualized resides in a different project managed by another team. You do not have access to this project, but you want your application
to be able to read data from the BigQuery dataset. What should you do?
A. Ask the other team to grant your default App Engine Service account the role of BigQuery Job User.
B. Ask the other team to grant your default App Engine Service account the role of BigQuery Data Viewer.
C. In Cloud IAM of your project, ensure that the default App Engine service account has the role of BigQuery Data Viewer.
D. In Cloud IAM of your project, grant a newly created service account from the other team the role of BigQuery Job User in your project.
Answer: B
Explanation:
The resource that you need to access is in the other project, so the role must be granted there.
roles/bigquery.dataViewer (BigQuery Data Viewer):
When applied to a table or view, this role provides permissions to read data and metadata from the table or view. This role cannot be applied to individual models or routines.
When applied to a dataset, this role provides permissions to read the dataset's metadata, list tables in the dataset, and read data and metadata from the dataset's tables.
When applied at the project or organization level, this role can also enumerate all datasets in the project. Additional roles, however, are necessary to allow the running of jobs.
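For illustration, the other team could grant the role with a single binding; the project IDs are placeholders, and the default App Engine service account always has the form PROJECT_ID@appspot.gserviceaccount.com:
gcloud projects add-iam-policy-binding OTHER_TEAM_PROJECT \
  --member="serviceAccount:MY_PROJECT@appspot.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"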
NEW QUESTION 55
For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already
installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?
A. 1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances. 2. Update your instances’ metadata to add the following value: logs-destination: bq://platform-logs.
B. 1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink. 2. Create a Cloud Function that is triggered by messages in the logs topic. 3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.
C. 1. In Stackdriver Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.
D. 1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset. 2. Configure this Cloud Function to create a BigQuery job that executes this query: INSERT INTO dataset.platform-logs (timestamp, log) SELECT timestamp, log FROM compute.logs WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY) 3. Use Cloud Scheduler to trigger this Cloud Function once a day.
Answer: C
Explanation:
* 1. In Stackdriver Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs
dataset as Sink Destination.
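A sketch of the equivalent sink created from the CLI, assuming the BigQuery dataset is named platform_logs (BigQuery dataset IDs cannot contain hyphens) in project MY_PROJECT:
gcloud logging sinks create compute-to-bq \
  bigquery.googleapis.com/projects/MY_PROJECT/datasets/platform_logs \
  --log-filter='resource.type="gce_instance"'
# The command output shows the sink's writer identity; grant it the
# BigQuery Data Editor role on the dataset so the export can write.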
NEW QUESTION 56
You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing
GCP project. What should you do?
A. * 1. Verify that you are assigned the Project Owners IAM role for this project.* 2. Locate the project in the GCP console, click Shut down and then enter the
project ID.
B. * 1. Verify that you are assigned the Project Owners IAM role for this project.* 2. Switch to the project in the GCP console, locate the resources and delete them.
C. * 1. Verify that you are assigned the Organizational Administrator IAM role for this project.* 2. Locate the project in the GCP console, enter the project ID and
then click Shut down.
D. * 1. Verify that you are assigned the Organizational Administrators IAM role for this project.* 2. Switch to the project in the GCP console, locate the resources
and delete them.
Answer: A
Explanation:
https://cloud.google.com/run/docs/tutorials/gcloud https://cloud.google.com/resource-manager/docs/creating-managing-projects
https://cloud.google.com/iam/docs/understanding-roles#primitive_roles
You can shut down projects using the Cloud Console. When you shut down a project, the following happens immediately: all billing and traffic serving stops, you lose access to the project, and the owners of the project are notified and can stop the deletion within 30 days. The project is scheduled to be deleted after 30 days; however, some resources may be deleted much earlier.
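The same shutdown can be performed from the CLI; a sketch with a placeholder project ID:
gcloud projects delete MY_DIVISION_PROJECT
# Within the 30-day window the project can be restored:
# gcloud projects undelete MY_DIVISION_PROJECT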
NEW QUESTION 60
You are developing a financial trading application that will be used globally. Data is stored and queried using a relational structure, and clients from all over the
world should get the exact identical state of the data. The application will be deployed in multiple regions to provide the lowest latency to end users. You need to
select a storage option for the application data while minimizing latency. What should you do?
Answer: C
Explanation:
Keywords: financial data used globally, stored and queried using a relational structure (SQL), clients should get the exact identical state of the data (strong consistency), deployed in multiple regions, lowest latency to end users. Cloud Spanner is the storage option that meets all of these requirements.
NEW QUESTION 64
You need to grant access for three users so that they can view and edit table data on a Cloud Spanner instance. What should you do?
Answer: B
Explanation:
https://cloud.google.com/spanner/docs/iam#spanner.databaseUser
Using the gcloud tool, execute the gcloud iam roles describe roles/spanner.databaseUser command on Cloud Shell. Attach the users to a newly created Google
group and add the group to the role.
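A sketch of granting the role to such a group at the project level (group address and project ID are placeholders):
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="group:spanner-data-editors@example.com" \
  --role="roles/spanner.databaseUser"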
NEW QUESTION 69
You have successfully created a development environment in a project for an application. This application uses Compute Engine and Cloud SQL. Now, you need
to create a production environment for this application.
The security team has forbidden the existence of network routes between these 2 environments, and asks you to follow Google-recommended practices. What
should you do?
A. Create a new project, enable the Compute Engine and Cloud SQL APIs in that project, and replicate the setup you have created in the development
environment.
B. Create a new production subnet in the existing VPC and a new production Cloud SQL instance in your existing project, and deploy your application using those
resources.
C. Create a new project, modify your existing VPC to be a Shared VPC, share that VPC with your new project, and replicate the setup you have in the
development environment in that new project, in the Shared VPC.
D. Ask the security team to grant you the Project Editor role in an existing production project used by another division of your company. Once they grant you that role, replicate the setup you have in the development environment in that project.
Answer: A
Explanation:
This aligns with Google's recommended practices. By creating a new project, we achieve complete isolation between the development and production environments, and we also isolate this production application from the production applications of other departments.
Ref: https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#define-hierarchy
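A sketch of the first two steps with placeholder IDs:
gcloud projects create my-app-prod --name="My App Production"
gcloud services enable compute.googleapis.com sqladmin.googleapis.com \
  --project=my-app-prod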
NEW QUESTION 74
You want to find out when users were added to Cloud Spanner Identity Access Management (IAM) roles on your Google Cloud Platform (GCP) project. What
should you do in the GCP Console?
Answer: D
Explanation:
https://cloud.google.com/monitoring/audit-logging
NEW QUESTION 79
You are building a multi-player gaming application that will store game information in a database. As the popularity of the application increases, you are concerned
about delivering consistent performance. You need to ensure an optimal gaming performance for global users, without increasing the management complexity.
What should you do?
A. Use Cloud SQL database with cross-region replication to store game statistics in the EU, US, and APAC regions.
B. Use Cloud Spanner to store user data mapped to the game statistics.
C. Use BigQuery to store game statistics with a Redis on Memorystore instance in the front to provide global consistency.
D. Store game statistics in a Bigtable database partitioned by username.
Answer: B
NEW QUESTION 81
You are working in a team that has developed a new application that needs to be deployed on Kubernetes. The production application is business critical and
should be optimized for reliability. You need to provision a Kubernetes cluster and want to follow Google-recommended practices. What should you do?
Answer: B
Explanation:
Autopilot clusters are optimized for reliability, and the stable release channel gives more time for issues in new GKE versions to be found and fixed before they reach your cluster.
NEW QUESTION 86
You have one GCP account running in your default region and zone and another account running in a
non-default region and zone. You want to start a new Compute Engine instance in these two Google Cloud Platform accounts using the command line interface.
What should you do?
A. Create two configurations using gcloud config configurations create [NAME]. Run gcloud config configurations activate [NAME] to switch between accounts
when running the commands to start the Compute Engine instances.
B. Create two configurations using gcloud config configurations create [NAME]. Run gcloud configurations list to start the Compute Engine instances.
C. Activate two configurations using gcloud configurations activate [NAME]. Run gcloud config list to start the Compute Engine instances.
D. Activate two configurations using gcloud configurations activate [NAME]. Run gcloud configurations list to start the Compute Engine instances.
Answer: A
Explanation:
"Run gcloud configurations list to start the Compute Engine instances". How the heck are you expecting to "start" GCE instances doing "configuration list".
Each gcloud configuration has a 1 to 1 relationship with the region (if a region is defined). Since we have two different regions, we would need to create two
separate configurations using gcloud config configurations createRef: https://cloud.google.com/sdk/gcloud/reference/config/configurations/create
Secondly, you can activate each configuration independently by running gcloud config configurations activate [NAME]Ref:
https://cloud.google.com/sdk/gcloud/reference/config/configurations/activate
Finally, while each configuration is active, you can run the gcloud compute instances start [NAME] command to start the instance in the configurations
region.https://cloud.google.com/sdk/gcloud/reference/compute/instances/start
NEW QUESTION 91
You created an instance of SQL Server 2017 on Compute Engine to test features in the new version. You want to connect to this instance using the fewest number
of steps. What should you do?
Answer: D
Explanation:
https://cloud.google.com/compute/docs/instances/connecting-to-windows#remote-desktop-connection-app
https://cloud.google.com/compute/docs/instances/windows/generating-credentials
https://cloud.google.com/compute/docs/instances/connecting-to-windows#before-you-begin
NEW QUESTION 94
You need to deploy an application in Google Cloud using serverless technology. You want to test a new version of the application with a small percentage of
production traffic. What should you do?
Answer: A
NEW QUESTION 99
You need to host an application on a Compute Engine instance in a project shared with other teams. You want to prevent the other teams from accidentally
causing downtime on that application. Which feature should you use?
Answer: D
Explanation:
As part of your workload, there might be certain VM instances that are critical to running your application or services, such as an instance running a SQL server, a server used as a license manager, and so on. These VM instances might need to stay running indefinitely, so you need a way to protect them from being deleted. By setting the deletionProtection flag, a VM instance can be protected from accidental deletion. If a user attempts to delete a VM instance for which you have set the deletionProtection flag, the request fails. Only a user that has been granted a role with the compute.instances.create permission can reset the flag to allow the resource to be deleted.
Ref: https://cloud.google.com/compute/docs/instances/preventing-accidental-vm-deletion
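A sketch of enabling the flag on an existing instance (instance name and zone are placeholders):
gcloud compute instances update critical-app-vm \
  --zone=us-central1-a --deletion-protection
# Use --no-deletion-protection later to allow deletion again.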
Answer: A
Explanation:
App Engine is a regional service, which means the infrastructure that runs your app(s) is located in a specific region and is managed by Google to be redundantly available across all the zones within that region. Once an App Engine deployment is created in a region, it can't be changed. The only way is to create a new project, create an App Engine instance in europe-west3, send all user traffic to this instance, and delete the App Engine instance in us-central.
Ref: https://cloud.google.com/appengine/docs/locations
Answer: C
A. Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
B. Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”
C. Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
D. Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
Answer: C
Explanation:
https://blog.doit-intl.com/the-truth-behind-google-cloud-egress-traffic-6e8f57b5c2f8
A. Deploy the application on App Engine. For each update, create a new version of the same service. Configure traffic splitting to send a small percentage of traffic to the new version.
B. Deploy the application on App Engine. For each update, create a new service. Configure traffic splitting to send a small percentage of traffic to the new service.
C. Deploy the application on Kubernetes Engine. For a new release, update the deployment to use the new version.
D. Deploy the application on Kubernetes Engine. For a new release, create a new deployment for the new version. Update the service to use the new deployment.
Answer: D
Explanation:
Keyword, Version, traffic splitting, App Engine supports traffic splitting for versions before releasing.
A. Save the automated first-of-the-month backup for three years. Store the backup file in an Archive class Cloud Storage bucket.
B. Convert the automated first-of-the-month backup to an export file. Write the export file to a Coldline class Cloud Storage bucket.
C. Set up an export job for the first of the month. Write the export file to an Archive class Cloud Storage bucket.
D. Set up an on-demand backup for the first of the month. Write the backup to an Archive class Cloud Storage bucket.
Answer: C
Explanation:
https://cloud.google.com/sql/docs/mysql/backup-recovery/backups#can_i_export_a_backup
https://cloud.google.com/sql/docs/mysql/import-export#automating_export_operations
Answer: C
Explanation:
Logging on to the console and following these steps shows all the assigned users and roles.
Answer: B
Explanation:
Google Cloud provides Cloud Audit Logs, which is an integral part of Cloud Logging. It consists of two log streams for each project: Admin Activity and Data Access, which are generated by Google Cloud services to help you answer the question of "who did what, where, and when?" within your Google Cloud projects.
Ref: https://cloud.google.com/iam/docs/job-functions/auditing#scenario_external_auditors
A. Use Cloud Logging filters to create log-based metrics for firewall and instance actions. Monitor the changes and set up reasonable alerts.
B. Install Kibana on a compute instance. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Pub/Sub. Target the Pub/Sub topic to push messages to the Kibana instance. Analyze the logs on Kibana in real time.
C. Turn on Google Cloud firewall rules logging, and set up alerts for any insert, update, or delete events.
D. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Cloud Storage. Use BigQuery to periodically analyze log events in the storage bucket.
Answer: A
Explanation:
This answer is the simplest and most effective way to monitor unexpected firewall changes and instance creation in Google Cloud. Cloud Logging filters allow you
to specify the criteria for the log entries that you want to view or export. You can use the Logging query language to write filters based on the LogEntry fields, such
as resource.type, severity, or protoPayload.methodName. For example, you can filter for firewall-related events by using the following query:
resource.type="gce_subnetwork" logName="projects/PROJECT_ID/logs/compute.googleapis.com%2Ffirewall"
You can filter for instance-related events by using the following query:
resource.type="gce_instance" logName="projects/PROJECT_ID/logs/compute.googleapis.com%2Factivity_log"
You can create log-based metrics from these filters to measure the rate or count of log entries that match the filter. Log-based metrics can be used to create charts
and dashboards in Cloud Monitoring, or to set up alerts based on the metric values. For example, you can create an alert policy that triggers when the log-based
metric for firewall changes exceeds a certain threshold in a given time interval. This way, you can get notified of any unexpected or malicious changes to your
firewall rules.
Option B is incorrect because it is unnecessarily complex and costly. Installing Kibana on a compute instance requires additional configuration and maintenance.
Creating a log sink to forward Cloud Audit Logs to Pub/Sub also incurs additional charges for the Pub/Sub service. Analyzing the logs on Kibana in real time may
not be feasible or efficient, as it requires constant monitoring and manual intervention.
Option C is incorrect because Google Cloud firewall rules logging is a different feature from Cloud Audit Logs. Firewall rules logging allows you to audit, verify, and
analyze the effects of your firewall rules by creating connection records for each rule that applies to traffic. However, firewall rules logging does not log the insert,
update, or delete events for the firewall rules themselves. Those events are logged by Cloud Audit Logs, which record the administrative activities in your Google
Cloud project.
Option D is incorrect because it is not a real-time solution. Creating a log sink to forward Cloud Audit Logs to Cloud Storage requires additional storage space and
charges. Using BigQuery to periodically analyze log events in the storage bucket also incurs additional costs for the BigQuery service. Moreover, this option does
not provide any alerting mechanism to notify you of any unexpected or malicious changes to your firewall rules or instances.
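As an illustrative sketch, a log-based metric for firewall-rule changes could be created from an Admin Activity audit-log filter and then attached to an alerting policy in Cloud Monitoring (the metric name and filter are examples, not the only valid form):
gcloud logging metrics create firewall-rule-changes \
  --description="Counts firewall rule insert/update/delete audit events" \
  --log-filter='resource.type="gce_firewall_rule" AND logName:"cloudaudit.googleapis.com%2Factivity"'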
A. Create a Service Account in your own project, and grant this Service Account access to BigQuery in your project
B. Create a Service Account in your own project, and ask the partner to grant this Service Account access to BigQuery in their project
C. Ask the partner to create a Service Account in their project, and have them give the Service Account access to BigQuery in their project
D. Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project
Answer: D
Explanation:
https://gtseres.medium.com/using-service-accounts-across-projects-in-gcp-cf9473fef8f0#:~:text=Go%20to%20t
Answer: C
Answer: D
A. Use a Deployment Manager script to automate creating storage buckets in an appropriate region
B. Use a local SSD to improve performance of the VM for the targeted workload
C. Use the gsutil command to create a storage bucket in the same region as the VM
D. Use a Persistent Disk SSD in the same zone as the VM to improve performance of the VM
Answer: A
A. Ask the auditor for their Google account, and give them the Viewer role on the project.
B. Ask the auditor for their Google account, and give them the Security Reviewer role on the project.
C. Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.
D. Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.
Answer: C
Explanation:
Using primitive roles: the following table lists the primitive roles that you can grant to access a project, the description of what each role does, and the permissions bundled within that role. Avoid using primitive roles except when absolutely necessary. These roles are very powerful and include a large number of permissions across all Google Cloud services. For more details on when you should use primitive roles, see the Identity and Access Management FAQ. IAM predefined roles are much more granular and allow you to carefully manage the set of permissions that your users have access to. See Understanding Roles for a list of roles that can be granted at the project level. Creating custom roles can further increase the control you have over user permissions.
https://cloud.google.com/resource-manager/docs/access-control-proj#using_primitive_roles
https://cloud.google.com/iam/docs/understanding-custom-roles
Answer: C
Explanation:
If your apps are fault-tolerant and can withstand possible instance preemptions, then preemptible instances can reduce your Compute Engine costs significantly.
For example, batch processing jobs can run on preemptible instances. If some of those instances stop during processing, the job slows but does not completely
stop. Preemptible instances complete your batch processing tasks without placing additional workload on your existing instances and without requiring you to pay
full price for additional normal instances.
https://cloud.google.com/compute/docs/instances/preemptible
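For illustration, a preemptible worker (a Spot VM on newer APIs) is created with a single extra flag (names are placeholders):
gcloud compute instances create batch-worker-1 \
  --zone=us-central1-a --machine-type=e2-standard-4 --preemptible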
Answer: A
Answer: C
Explanation:
We need to apply GCP best practices. roles/browser (Browser): read access to browse the hierarchy for a project, including the folder, organization, and IAM policy. This role doesn't include permission to view resources in the project.
https://cloud.google.com/iam/docs/understanding-roles