
Google

Professional-Cloud-Architect
Google Certified Professional - Cloud Architect
QUESTION & ANSWERS

QUESTION 1

One of the developers on your team deployed their application in Google Container Engine with the
Dockerfile below. They report that their application deployments are taking too long.

You want to optimize this Dockerfile for faster deployment times without adversely affecting the app’s
functionality. Which two actions should you take? Choose 2 answers.
A. Remove Python after running pip.
B. Remove dependencies from requirements.txt
C. Use a slimmed-down base image like Alpine Linux.
D. Use larger machine types for your Google Container Engine node pools.
E. Copy the source after the package dependencies (Python and pip) are installed.

Correct Answer: C,E

Explanation/Reference:

Deployment speed can be improved by limiting the size of the uploaded app, limiting the complexity of any build the Dockerfile requires, and ensuring a fast and reliable internet connection. Copying the application source only after the package dependencies (Python and pip) are installed also lets Docker reuse the cached dependency layers on subsequent builds, so a source change does not force the dependencies to be reinstalled.
Note: Alpine Linux is built around musl libc and BusyBox, which makes it smaller and more resource-efficient than traditional GNU/Linux distributions. A container requires no more than 8 MB, and a minimal installation to disk requires around 130 MB of storage. Not only do you get a fully fledged Linux environment, but also a large selection of packages from the repository.
References:
https://groups.google.com/forum/#!topic/google-appengine/hZMEkmmObDU
https://www.alpinelinux.org/about/

QUESTION 2

Your application needs to process credit card transactions. You want the smallest scope of Payment
Card Industry (PCI) compliance without compromising the ability to analyze transactional data and
trends relating to which payment methods are used. How should you design your architecture?
A. Create a tokenizer service and store only tokenized data.
B. Create separate projects that only process credit card data.
C. Create separate subnetworks and isolate the components that process credit card data.
D. Streamline the audit discovery phase by labeling all of the virtual machines (VMs) that process PCI
data.
E. Enable Logging export to Google BigQuery and use ACLs and views to scope the data shared with
the auditor.

Correct Answer: A

Explanation/Reference:

Reference:
https://www.sans.org/reading-room/whitepapers/compliance/ways-reduce-pci-dss-audit-scope-tokenizing-cardholder-data-33194

QUESTION 3

As part of implementing their disaster recovery plan, your company is trying to replicate
their production MySQL database from their private data center to their GCP project using a Google
Cloud VPN connection. They are experiencing latency issues and a small amount of packet loss that is
disrupting the replication. What should they do?
A. Configure their replication to use UDP.
B. Configure a Google Cloud Dedicated Interconnect.
C. Restore their database daily using Google Cloud SQL.
D. Add additional VPN connections and load balance them.
E. Send the replicated transaction to Google Cloud Pub/Sub.

Correct Answer: B

QUESTION 4

You need to upload files from your on-premises environment to Cloud Storage. You want the files to
be encrypted on Cloud Storage using customer-supplied encryption keys. What should you do?
A. Supply the encryption key in a .boto configuration file. Use gsutil to upload the files.
B. Supply the encryption key using gcloud config. Use gsutil to upload the files to that bucket.
C. Use gsutil to upload the files, and use the flag --encryption-key to supply the encryption key.
D. Use gsutil to create a bucket, and use the flag --encryption-key to supply the encryption key. Use
gsutil to upload the files to that bucket.

Correct Answer: A

Explanation/Reference:

gsutil reads customer-supplied encryption keys from the .boto configuration file (an encryption_key entry in the [GSUtil] section); there is no --encryption-key flag on gsutil commands.
Reference: https://cloud.google.com/storage/docs/encryption/customer-supplied-keys
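A minimal sketch of option A, assuming gsutil is installed and authenticated; the bucket name and the key value below are placeholders, not part of the question:

# Illustrative only. First, add a customer-supplied AES-256 key to ~/.boto:
#   [GSUtil]
#   encryption_key = YOUR_BASE64_ENCODED_AES256_KEY
# Then upload; gsutil encrypts each object with the key from .boto:
gsutil -m cp -r ./on-prem-files gs://example-csek-bucket/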

QUESTION 5

You are using Cloud CDN to deliver static HTTP(S) website content hosted on a Compute Engine
instance group. You want to improve the cache hit ratio.
What should you do?

A. Customize the cache keys to omit the protocol from the key.
B. Shorten the expiration time of the cached objects.
C. Make sure the HTTP(S) header “Cache-Region” points to the closest region of your users.
D. Replicate the static content in a Cloud Storage bucket. Point Cloud CDN toward a load balancer on that bucket.

Correct Answer: A

Explanation/Reference:

Reference: https://cloud.google.com/cdn/docs/best-practices#using_custom_cache_keys_to_improve_cache_hit_ratio
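A hedged sketch of option A, assuming the site is served through a global external HTTP(S) load balancer; the backend service name is a placeholder:

# Illustrative only: omit the protocol from the cache key so that http:// and
# https:// requests for the same URL share a single cache entry.
gcloud compute backend-services update example-web-backend \
    --global \
    --no-cache-key-include-protocol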

QUESTION 6

For this question, refer to the JencoMart case study. JencoMart has built a version of their application
on Google Cloud Platform that serves traffic to Asia. You want to measure success against their
business and technical goals. Which metrics should you track?
A. Error rates for requests from Asia
B. Latency difference between US and Asia
C. Total visits, error rates, and latency from Asia
D. Total visits and average latency for users in Asia
E. The number of character sets present in the database

Correct Answer: D

Explanation/Reference:

From scenario:
Business Requirements include: Expand services into Asia
Technical Requirements include: Decrease latency in Asia

QUESTION 7

You want to automate the creation of a managed instance group and a startup script to install the OS
package dependencies. You want to minimize the startup time for VMs in the instance group. What
should you do?
A. Use Terraform to create the managed instance group and a startup script to install the OS
package dependencies.
B. Create a custom VM image with all OS package dependencies. Use Deployment Manager to create
the managed instance group with the VM image.
C. Use Puppet to create the managed instance group and install the OS package dependencies.

D. Use Deployment Manager to create the managed instance group and Ansible to install the OS
package dependencies.

Correct Answer: B

Explanation/Reference:

Baking the OS package dependencies into a custom VM image means the instances in the managed instance group boot without installing packages, which minimizes startup time; running a configuration tool such as Ansible or Puppet from a startup script adds to it.
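A minimal sketch of the custom-image approach, assuming a VM disk that already has the dependencies installed; all resource names are placeholders:

# Illustrative only: bake the dependencies into an image once, then build the
# instance template and managed instance group from that image.
gcloud compute images create app-base-image \
    --source-disk=prepared-build-vm --source-disk-zone=us-central1-a
gcloud compute instance-templates create app-template --image=app-base-image
gcloud compute instance-groups managed create app-mig \
    --template=app-template --size=3 --zone=us-central1-a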

QUESTION 8

For this question, refer to the Dress4Win case study. You are responsible for the security of data
stored in Cloud Storage for your company, Dress4Win. You have already created a set of Google
Groups and assigned the appropriate users to those groups. You should use Google best practices and
implement the simplest design to meet the requirements. Considering Dress4Win’s business and
technical requirements, what should you do?
A. Assign custom IAM roles to the Google Groups you created in order to enforce security requirements. Encrypt data with a customer-supplied encryption key when storing files in Cloud Storage.
B. Assign custom IAM roles to the Google Groups you created in order to enforce security
requirements. Enable default storage encryption before storing files in Cloud Storage.
C. Assign predefined IAM roles to the Google Groups you created in order to enforce security
requirements. Utilize Google’s default encryption at rest when storing files in Cloud Storage.
D. Assign predefined IAM roles to the Google Groups you created in order to enforce security
requirements. Ensure that the default Cloud KMS key is set before storing files in Cloud Storage.

Correct Answer: C

Explanation/Reference:

Predefined IAM roles and Google's default encryption at rest follow Google best practices and are the simplest design that meets the requirements; custom roles, customer-supplied keys, or Cloud KMS add management overhead the scenario does not call for.

QUESTION 9

For this question, refer to the TerramEarth case study. TerramEarth has equipped unconnected trucks
with servers and sensors to collect telemetry data. Next year they want to use the data to train
machine learning models. They want to store this data in the cloud while reducing costs. What should
they do?
A. Have the vehicle's computer compress the data in hourly snapshots, and store it in a Google Cloud
Storage (GCS) Nearline bucket.
B. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and
store it in Google BigQuery.
C. Push the telemetry data in real-time to a streaming dataflow job that compresses the data, and
store it in Cloud Bigtable.
D. Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket.

Correct Answer: D

Explanation/Reference:

Coldline Storage is the best choice for data that you plan to access at most once a year, due to its slightly lower availability, 90-day minimum storage duration, costs for data access, and higher per-operation costs. For example:
Cold data storage - Infrequently accessed data, such as data stored for legal or regulatory reasons, can be stored at low cost as Coldline Storage and be available when you need it.
Disaster recovery - In the event of a disaster, recovery time is key. Cloud Storage provides low-latency access to data stored as Coldline Storage.
References: https://cloud.google.com/storage/docs/storage-classes
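A minimal sketch of the storage side of option D, assuming gsutil is authenticated; the bucket name, location, and file paths are placeholders:

# Illustrative only: create a Coldline bucket and upload the compressed hourly snapshots.
gsutil mb -c coldline -l US gs://example-telemetry-archive/
gsutil -m cp vehicle-snapshots/*.tar.gz gs://example-telemetry-archive/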

QUESTION 10

You have an application that makes HTTP requests to Cloud Storage. Occasionally the requests fail
with
HTTP status codes of 5xx and 429.
How should you handle these types of errors?
A. Use gRPC instead of HTTP for better performance.
B. Implement retry logic using a truncated exponential backoff strategy.
C. Make sure the Cloud Storage bucket is multi-regional for geo-redundancy.
D. Monitor https://status.cloud.google.com/feed.atom and only make requests if Cloud Storage is not reporting an incident.

Correct Answer: B

Explanation/Reference:

Reference https://cloud.google.com/storage/docs/json_api/v1/status-codes
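A minimal sketch of option B, assuming the request is made against the Cloud Storage JSON API with curl; the bucket name and the retry limits are placeholders:

# Illustrative only: retry on 429 and 5xx responses with truncated exponential backoff plus jitter.
url="https://storage.googleapis.com/storage/v1/b/example-bucket/o"
max_retries=5
max_backoff=32   # seconds; the truncation cap

for attempt in $(seq 0 "$max_retries"); do
  status=$(curl -s -o /dev/null -w '%{http_code}' \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" "$url")

  # 2xx success, or a non-retryable 3xx/4xx response (other than 429): stop retrying.
  if [[ "$status" -ge 200 && "$status" -lt 500 && "$status" -ne 429 ]]; then
    echo "Request finished with HTTP $status"
    break
  fi

  # Wait min(2^attempt, max_backoff) seconds plus up to one second of random jitter.
  backoff=$(( 2 ** attempt ))
  if [[ "$backoff" -gt "$max_backoff" ]]; then backoff=$max_backoff; fi
  jitter=$(printf '%03d' $(( RANDOM % 1000 )))
  echo "Got HTTP $status, retrying in ${backoff}.${jitter}s"
  sleep "${backoff}.${jitter}"
done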

QUESTION 11

For this question, refer to the Dress4Win case study. Dress4Win has configured a new uptime check
with Google Stackdriver for several of their legacy services. The Stackdriver dashboard is not
reporting the services as healthy. What should they do?
A. Install the Stackdriver agent on all of the legacy web servers.
B. In the Cloud Platform Console download the list of the uptime servers' IP addresses and create an
inbound firewall rule
C. Configure their load balancer to pass through the User-Agent HTTP header when the value
matches GoogleStackdriverMonitoring-UptimeChecks (https://cloud.google.com/monitoring)
D. Configure their legacy web servers to allow requests that contain a User-Agent HTTP header whose value matches GoogleStackdriverMonitoring-UptimeChecks (https://cloud.google.com/monitoring)

Correct Answer: D

QUESTION 12

For this question, refer to the Mountkirk Games case study. Mountkirk Games' gaming servers are not
automatically scaling properly. Last month, they rolled out a new feature, which suddenly became
very popular. A record number of users are trying to use the service, but many of them are getting
503 errors and very slow response times. What should they investigate first?
A. Verify that the database is online.
B. Verify that the project quota hasn't been exceeded.
C. Verify that the new feature code did not introduce any performance bugs.
D. Verify that the load-testing team is not running their tool against production.

Correct Answer: B

Explanation/Reference:

503 is the Service Unavailable error. If the database were offline, every request would fail rather than only some of them, so an exceeded project quota is the more likely cause and the first thing to verify.
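A hedged way to inspect the relevant quotas from the command line; the project ID and region are placeholders:

# Illustrative only: list project-wide and regional quota usage.
gcloud compute project-info describe --project example-project
gcloud compute regions describe us-central1 --project example-project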

QUESTION 13

For this question, refer to the Mountkirk Games case study. Mountkirk Games wants you to design
their new testing strategy. How should the test coverage differ from their existing backends on the
other platforms?
A. Tests should scale well beyond the prior approaches.
B. Unit tests are no longer required, only end-to-end tests.
C. Tests should be applied after the release is in the production environment.
D. Tests should include directly testing the Google Cloud Platform (GCP) infrastructure.

Correct Answer: A

Explanation/Reference:

From Scenario:
A few of their games were more popular than expected, and they had problems scaling their
application servers, MySQL databases, and analytics tools.
Requirements for Game Analytics Platform include: Dynamically scale up or down based on game
activity

QUESTION 14

You need to develop procedures to test a disaster plan for a mission-critical application. You want to
use
Google-recommended practices and native capabilities within GCP.
What should you do?
A. Use Deployment Manager to automate service provisioning. Use Activity Logs to monitor and
debug your tests.
B. Use Deployment Manager to automate provisioning. Use Stackdriver to monitor and debug your
tests.
C. Use gcloud scripts to automate service provisioning. Use Activity Logs to monitor and debug your tests.
D. Use automated scripts to automate service provisioning. Use Activity Logs to monitor and debug your tests.

Correct Answer: B

Explanation/Reference:

Deployment Manager is the native GCP tool for automating provisioning, and Stackdriver is the Google-recommended native service for monitoring and debugging; Activity Logs only record administrative actions.
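A minimal sketch of the provisioning step in option B, assuming a Deployment Manager configuration file already exists; the deployment name and config file are placeholders:

# Illustrative only: stand up the disaster-recovery test environment from a config,
# then tear it down again once the drill is complete.
gcloud deployment-manager deployments create dr-test --config dr-test.yaml
gcloud deployment-manager deployments delete dr-test --quiet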

QUESTION 15

For this question, refer to the Dress4Win case study. At Dress4Win, an operations engineer wants to
create a low-cost solution to remotely archive copies of database backup files. The database files are
compressed tar files stored in their current data center. How should he proceed?
A. Create a cron script using gsutil to copy the files to a Coldline Storage bucket.
B. Create a cron script using gsutil to copy the files to a Regional Storage bucket.
C. Create a Cloud Storage Transfer Service Job to copy the files to a Coldline Storage bucket.
D. Create a Cloud Storage Transfer Service job to copy the files to a Regional Storage bucket.

Correct Answer: A

Explanation/Reference:

Follow these rules of thumb when deciding whether to use gsutil or Storage Transfer Service:
- When transferring data from an on-premises location, use gsutil.
- When transferring data from another cloud storage provider, use Storage Transfer Service.
- Otherwise, evaluate both tools with respect to your specific scenario.
Use this guidance as a starting point. The specific details of your transfer scenario will also help you determine which tool is more appropriate.
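A minimal sketch of option A, assuming gsutil is installed on a machine in the data center and the Coldline bucket already exists; the bucket name, file path, schedule, and user are placeholders:

# Illustrative only: a cron entry (e.g. in /etc/cron.d/db-archive) that copies the
# compressed backup tars to the Coldline bucket nightly at 02:00.
0 2 * * * backup gsutil -m cp /backups/*.tar.gz gs://example-dress4win-db-archive/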
