Google Cloud Architect Exam Discussions

Uploaded by gcpkhagani

- Expert Verified, Online, Free.


 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 6 QUESTION 1 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 1

Topic #: 6

[All Professional Cloud Architect Questions]

For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to migrate from their current analytics and statistics reporting model to one that meets their technical requirements on Google Cloud Platform.

Which two steps should be part of their migration plan? (Choose two.)

A. Evaluate the impact of migrating their current batch ETL code to Cloud Dataflow.

B. Write a schema migration plan to denormalize data for better performance in BigQuery.

C. Draw an architecture diagram that shows how to move from a single MySQL database to a MySQL cluster.

D. Load 10 TB of analytics data from a previous game into a Cloud SQL instance, and run test queries against the full dataset to confirm that they complete successfully.

E. Integrate Cloud Armor to defend against possible SQL injection attacks in analytics files uploaded to Cloud Storage.


by  AWS56 at Jan. 13, 2020, 1:36 p.m.

Comments
  sri007 Highly Voted  3 years, 12 months ago

Correct Answer A, B

Evaluate the impact of migrating their current batch ETL code to Cloud Dataflow

Write a schema migration plan to denormalize data for better performance in BigQuery.

Stream processing (ETL) Dataflow and reference: https://cloud.google.com/bigquery/docs/loading-data#loading_denormalized_nested_and_repeated_data
upvoted 31 times
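The schema change option B describes can be illustrated with a small sketch. The table and field names below (players, sessions) are hypothetical, not from the case study; the point is that BigQuery performs better when related rows are embedded as nested, repeated fields instead of joined at query time.

```python
# Hypothetical normalized rows, as they might sit in MySQL today.
# Denormalizing embeds each player's sessions as a nested, repeated
# field, the layout the BigQuery loading docs recommend for performance.

def denormalize(player, sessions):
    """Build one BigQuery-style record with a repeated 'sessions' field."""
    return {
        "player_id": player["id"],
        "name": player["name"],
        # Becomes a REPEATED RECORD column in BigQuery, removing the
        # JOIN a normalized schema would require at query time.
        "sessions": [
            {"game": s["game"], "score": s["score"]}
            for s in sessions
            if s["player_id"] == player["id"]
        ],
    }

player = {"id": 1, "name": "alice"}
all_sessions = [
    {"player_id": 1, "game": "arena", "score": 10},
    {"player_id": 2, "game": "arena", "score": 7},
]
row = denormalize(player, all_sessions)
print(row)
```

A schema migration plan would apply this reshaping to each query path that currently joins tables in MySQL.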

  AWS56 Highly Voted  4 years ago

agree AB
upvoted 12 times

  tartar 3 years, 5 months ago


AB is ok
upvoted 8 times

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 6 QUESTION 2 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 2

Topic #: 6

[All Professional Cloud Architect Questions]

For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the compute workloads for your company, Mountkirk Games. Considering the Mountkirk Games business and technical requirements, what should you do?

A. Create network load balancers. Use preemptible Compute Engine instances.

B. Create network load balancers. Use non-preemptible Compute Engine instances.

C. Create a global load balancer with managed instance groups and autoscaling policies. Use preemptible Compute Engine instances.

D. Create a global load balancer with managed instance groups and autoscaling policies. Use non-preemptible Compute Engine instances.


by  dabrat at Nov. 18, 2019, 10:16 p.m.

Comments
  dabrat Highly Voted  4 years, 1 month ago

D) => KPI game stability = Use non-preemptible


upvoted 48 times

  tartar 3 years, 5 months ago


D is ok
upvoted 13 times

  nitinz 2 years, 10 months ago


It has to be C; A and B do not meet the SLA, and D does not meet the KPI.
upvoted 2 times

  KNG Highly Voted  3 years, 12 months ago

Agree, "D". Preemptible VMs are suitable for apps that are fault-tolerant. Termination of a preemptible VM might affect the gaming experience, so it is not a good choice.
upvoted 18 times

  someCloudUser Most Recent  10 months, 3 weeks ago

Selected Answer: D

D is correct in my opinion.
upvoted 1 times

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 6 QUESTION 3 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 3

Topic #: 6

[All Professional Cloud Architect Questions]

For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to design their solution for the future in order to take advantage of cloud and technology improvements as they become available. Which two steps should they take? (Choose two.)

A. Store as much analytics and game activity data as financially feasible today so it can be used to train machine learning models to predict user behavior in the future.

B. Begin packaging their game backend artifacts in container images and running them on Google Kubernetes Engine to improve the ability to scale up or down based on game activity.

C. Set up a CI/CD pipeline using Jenkins and Spinnaker to automate canary deployments and improve development velocity.

D. Adopt a schema versioning tool to reduce downtime when adding new game features that require storing additional player data in the database.

E. Implement a weekly rolling maintenance process for the Linux virtual machines so they can apply critical kernel patches and package updates and reduce the risk of 0-day vulnerabilities.


by  dabrat at Nov. 18, 2019, 10:20 p.m.

Comments
  dabrat Highly Voted  4 years, 1 month ago

A+B) => from the case study: "as well as other metrics that provide deeper insight into usage patterns so we can adapt the game to target users" and "environment that provides autoscaling, low latency load balancing, and frees us up from managing physical servers."
upvoted 80 times

  techalik 3 years, 1 month ago


"Enable CI/CD integration to improve deployment velocity, agility and reaction to change" is a right answer.

Having a CI/CD pipeline means you can deploy changes to environments faster. Does this help you take advantage of cloud and technology improvements as they become available in the future? Yes. When new features become available, you can incorporate them into your application and deploy them to test/production environments easily and efficiently, and decrease the time to go live.

"Store more data and use it as training data for machine learning" is a right answer.

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 6 QUESTION 4 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 4

Topic #: 6

[All Professional Cloud Architect Questions]

For this question, refer to the Mountkirk Games case study. Mountkirk Games wants you to design a way to test the analytics platform's resilience to changes in mobile network latency. What should you do?

A. Deploy failure injection software to the game analytics platform that can inject additional latency to mobile client analytics traffic.

B. Build a test client that can be run from a mobile phone emulator on a Compute Engine virtual machine, and run multiple copies in Google Cloud Platform regions all over the world to generate realistic traffic.

C. Add the ability to introduce a random amount of delay before beginning to process analytics files uploaded from mobile devices.

D. Create an opt-in beta of the game that runs on players' mobile devices and collects response times from analytics endpoints running in Google Cloud Platform regions all over the world.


by  sri007 at Jan. 17, 2020, 7:43 a.m.

Comments
  a66030 Highly Voted  3 years, 2 months ago

The answer is A. The question asks to test the analytics platform's resilience to changes in mobile network latency. Only A adds latency to the mobile network.
C adds a delay at the beginning of file processing; it does not add delay/latency in the mobile network.
One of the lines in the requirements for analytics: "Process data that arrives late because of slow mobile networks".
upvoted 37 times

  ShadowLord 1 year, 4 months ago


C is alright as well but just too specific to file upload
upvoted 2 times
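Option A's approach can be sketched as a wrapper that injects delay in front of a handler. The function names here are hypothetical; real failure-injection tools typically work at the network layer, but the principle is the same:

```python
import random
import time

def with_injected_latency(handler, min_ms, max_ms):
    """Wrap a handler so every call is delayed by a random amount,
    simulating variable mobile-network latency (the idea in option A)."""
    def wrapped(*args, **kwargs):
        time.sleep(random.uniform(min_ms, max_ms) / 1000.0)
        return handler(*args, **kwargs)
    return wrapped

def process_analytics(payload):
    # Stand-in for the analytics platform's request handler.
    return {"processed": payload}

slow = with_injected_latency(process_analytics, min_ms=5, max_ms=20)
start = time.monotonic()
result = slow("event")
elapsed_ms = (time.monotonic() - start) * 1000
print(result, round(elapsed_ms), "ms")
```

The test then measures whether the platform still meets its processing guarantees when the injected delay grows.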

  sri007 Highly Voted  3 years, 12 months ago


Correct Answer C

Add the ability to introduce a random amount of delay before beginning to process analytics files uploaded from mobile devices.
upvoted 28 times

  Ani26 3 years, 5 months ago


Nothing is mentioned about uploading analytics files from mobile devices; at the analytics layer we need to perform a resiliency test against latency changes, so A.

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 6 QUESTION 5 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 5

Topic #: 6

[All Professional Cloud Architect Questions]

For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the database workloads for your company, Mountkirk Games. Considering the business and technical requirements, what should you do?

A. Use Cloud SQL for time series data, and use Cloud Bigtable for historical data queries.

B. Use Cloud SQL to replace MySQL, and use Cloud Spanner for historical data queries.

C. Use Cloud Bigtable to replace MySQL, and use BigQuery for historical data queries.

D. Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.


by  jcmoranp at Oct. 26, 2019, 11:41 a.m.

Comments
  misho Highly Voted  3 years, 7 months ago

For the people who say it's C in Linux Academy, did you see the technical requirements there? The old technical requirements have the line "Connect to a managed NoSQL database service", but in the technical requirements on Google's official site and in this question that line is replaced with the following 2 lines: "Connect to a transactional database service to manage user profiles and game state" and "Store game activity in a timeseries database service for future analysis". And for them, definitely D is the answer!
upvoted 65 times

  sri007 Highly Voted  3 years, 12 months ago

Correct Answer D

Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.

Storing time-series data in Cloud Bigtable is a natural fit, Cloud Spanner scales horizontally and serves data with low latency while maintaining
transactional consistency and industry-leading 99.999% (five 9s) availability - 10x less downtime than four nines (<5 minutes per year). Cloud
Spanner helps future-proof your database backend. After you load your data into BigQuery, you can query the data in your tables. BigQuery
supports two types of queries: Interactive queries, Batch queries
upvoted 40 times

  AdityaGupta 3 years, 2 months ago


I agree with above explanation and choice
upvoted 8 times

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 6 QUESTION 6 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 6

Topic #: 6

[All Professional Cloud Architect Questions]

For this question, refer to the Mountkirk Games case study. Which managed storage option meets Mountkirk's technical requirement for storing game activity in a time series database service?

A. Cloud Bigtable

B. Cloud Spanner

C. BigQuery

D. Cloud Datastore


by  jcmoranp at Oct. 26, 2019, 11:42 a.m.

Comments
  Eroc Highly Voted  4 years, 2 months ago

@jcmoranp, that is incorrect; it's A. See https://cloud.google.com/bigtable/docs/schema-design-time-series
upvoted 27 times

  zbyszekz 2 years, 3 months ago


It is not clear, read technical requirements: "Store game activity logs in structured files for future analysis." so I think that D is a good option
upvoted 1 times

  sri007 Highly Voted  3 years, 12 months ago

Correct Answer A

Cloud Bigtable

Storing time series data in Cloud Bigtable https://cloud.google.com/bigtable/docs/schema-design-time-series


upvoted 19 times
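The schema-design guide linked above recommends row keys that keep one series contiguous and sort the newest entries first. A sketch of that pattern, with a hypothetical player id and made-up millisecond timestamps:

```python
MAX_TS_MS = 10**13  # illustrative ceiling for millisecond timestamps

def row_key(player_id, ts_ms):
    """Bigtable-style row key: series identifier first, then a reversed
    timestamp so that a prefix scan returns the newest activity first."""
    return f"{player_id}#{MAX_TS_MS - ts_ms:013d}"

# Lexicographic (Bigtable) order now puts the latest event first
# within the "player42" prefix.
keys = sorted(row_key("player42", t) for t in [1000, 3000, 2000])
print(keys)
```

This is why Bigtable fits time-series workloads: one cheap prefix scan per series, ordered the way the queries want it.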

  anirban7172 Most Recent  7 months ago

Selected Answer: A

Google Bigtable is a fully managed, scalable NoSQL database service for large analytical and operational workloads.
upvoted 1 times

  megumin 1 year, 2 months ago



 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 7 QUESTION 1 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 1

Topic #: 7

[All Professional Cloud Architect Questions]

You need to optimize batch file transfers into Cloud Storage for Mountkirk Games' new Google Cloud solution. The batch files contain game statistics that need to be staged in Cloud Storage and be processed by an extract transform load (ETL) tool. What should you do?

A. Use gsutil to batch move files in sequence.

B. Use gsutil to batch copy the files in parallel.

C. Use gsutil to extract the files as the first part of ETL.

D. Use gsutil to load the files as the last part of ETL.


by  kopper2019 at July 3, 2021, 3:18 a.m.

Comments
  kopper2019 Highly Voted  2 years, 6 months ago

hey guys new Qs posted as of July 12th, 2021, All 21 new Qs in Question #152
upvoted 13 times

  victory108 Highly Voted  2 years, 6 months ago

B. Use gsutil to batch copy the files in parallel.


upvoted 13 times
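With gsutil, the parallel batch copy in option B is `gsutil -m cp` rather than copying files one by one. A minimal Python sketch of why parallelism helps, with local directories standing in for Cloud Storage buckets and made-up file names:

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def copy_batch_parallel(files, dest, workers=4):
    """Copy a batch of files concurrently, the way `gsutil -m cp`
    parallelizes uploads (local directories stand in for buckets)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda f: shutil.copy(f, dest), files))

src, dst = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
files = []
for i in range(8):
    p = src / f"stats_{i}.csv"
    p.write_text("game,score\n")  # made-up batch statistics file
    files.append(p)

copy_batch_parallel(files, dst)
copied = sorted(p.name for p in dst.iterdir())
print(copied)
```

For many small-to-medium files, overlapping transfers hides per-file latency, which is the whole argument for B over A.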

  e5019c6 Most Recent  2 weeks, 6 days ago

Selected Answer: B

I voted A.
I understand the impulse to choose B, parallel upload. But remember that this is the first step in an ETL process where, as I understand it, when a file is uploaded an extraction process starts, followed by the load. If multiple files finish uploading at the same time, the extraction and loading processes get triggered in parallel too, which can cause errors if they are not prepared to handle this.
Anyway, now that I read the question again, I don't see that it specifies any kind of automation like I was thinking, so B is probably fine in this case...
upvoted 1 times

  odacir 1 month, 3 weeks ago


Selected Answer: B

I changed my mind, it is B!

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 7 QUESTION 2 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 2

Topic #: 7

[All Professional Cloud Architect Questions]

You are implementing Firestore for Mountkirk Games. Mountkirk Games wants to give a new game programmatic access to a legacy game's Firestore database. Access should be as restricted as possible. What should you do?

A. Create a service account (SA) in the legacy game's Google Cloud project, add a second SA in the new game's IAM page, and then give the Organization Admin role to both SAs.

B. Create a service account (SA) in the legacy game's Google Cloud project, give the SA the Organization Admin role, and then give it the Firebase Admin role in both projects.

C. Create a service account (SA) in the legacy game's Google Cloud project, add this SA in the new game's IAM page, and then give it the Firebase Admin role in both projects.

D. Create a service account (SA) in the legacy game's Google Cloud project, give it the Firebase Admin role, and then migrate the new game to the legacy game's project.


by  kopper2019 at July 3, 2021, 3:18 a.m.

Comments
  MamthaSJ Highly Voted  2 years, 6 months ago

Answer is C
upvoted 15 times

  WFCheong Highly Voted  1 year ago

I think it should not simply give out the Organization Admin role, so A and B are out. We should not migrate the new game to the legacy game's project, thus D is out. That leaves C as the only choice.
upvoted 9 times

  thewalker Most Recent  1 month, 2 weeks ago

Selected Answer: C

C is the best of the options provided.


upvoted 1 times

  gonlafer 1 year, 1 month ago



 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 7 QUESTION 3 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 3

Topic #: 7

[All Professional Cloud Architect Questions]

Mountkirk Games wants to limit the physical location of resources to their operating Google Cloud regions. What should you do?

A. Configure an organizational policy which constrains where resources can be deployed.

B. Configure IAM conditions to limit what resources can be configured.

C. Configure the quotas for resources in the regions not being used to 0.

D. Configure a custom alert in Cloud Monitoring so you can disable resources as they are created in other regions.


by  XDevX at July 1, 2021, 11:33 a.m.

Comments
  MamthaSJ Highly Voted  2 years, 6 months ago

Answer is A
upvoted 29 times

  muhasinem Highly Voted  2 years, 6 months ago

A is correct .
You can limit the physical location of a new resource with the Organization Policy Service resource locations constraint. You can use the location
property of a resource to identify where it is deployed and maintained by the service. For data-containing resources of some Google Cloud
services, this property also reflects the location where data is stored. This constraint allows you to define the allowed Google Cloud locations
where the resources for supported services in your hierarchy can be created.

After you define resource locations, this limitation will apply only to newly-created resources. Resources you created before setting the resource
locations constraint will continue to exist and perform their function.
https://cloud.google.com/resource-manager/docs/organization-policy/defining-locations
upvoted 28 times
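The effect of the constraint can be mimicked in a few lines. The allowlist below is illustrative, not a real org-policy file; the real mechanism is `constraints/gcp.resourceLocations` evaluated by the Organization Policy Service at resource creation:

```python
# Illustrative allowlist; a real policy would use constraint values such
# as value groups under constraints/gcp.resourceLocations.
ALLOWED_LOCATIONS = {"europe-west1", "europe-west2"}

def can_create(region):
    """Mimic option A's proactive behavior: creation in a disallowed
    region fails before the resource ever exists, unlike option D,
    which only alerts after the fact."""
    return region in ALLOWED_LOCATIONS

print(can_create("europe-west1"))
print(can_create("us-central1"))
```

That creation-time rejection is why the org policy beats both quotas (C) and monitoring alerts (D) here.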

  thewalker Most Recent  1 month, 2 weeks ago

Selected Answer: A

A. A proactive step is better than a reactive one, so A is better than C.
upvoted 1 times

  Majosh 2 months, 2 weeks ago



 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 8 QUESTION 1 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 1

Topic #: 8

[All Professional Cloud Architect Questions]

TerramEarth's CTO wants to use the raw data from connected vehicles to help identify approximately when a vehicle in the field will have a catastrophic failure. You want to allow analysts to centrally query the vehicle data. Which architecture should you recommend?

[Options A through D are architecture diagrams that are not reproduced in this text version.]


by  MJK at Dec. 30, 2019, 12:17 p.m.

Comments
  hems4all Highly Voted  3 years, 2 months ago
A is correct

As described in the Designing a Connected Vehicle Platform on Cloud IoT Core case study,

1. Google Cloud Dataflow is essential to transform, enrich and then store telemetry data by using distributed data pipelines

2. Cloud Pub/Sub is essential to handle the streams of vehicle data while at the same time decoupling the specifics of the backend processing
implementation

It now comes down to a choice between

1. Cloud SQL vs BigQuery for analytics.

2. GKE (with or without Anthos) + Cloud Load balancing vs App Engine.

For the first point, there is no doubt that BigQuery is the preferred choice for analytics. Cloud SQL does not scale to this sort of data volume
(9TB/day + data coming through when vehicles are serviced).

For the second point, GKE with Cloud Load Balancing is a better fit than App Engine. App Engine is a regional service whereas, with the other
option, you can have multiple GKE clusters in different regions. And Cloud Load Balancing can send requests to the cluster in the region that is
closest to the vehicle. This option minimizes the latency and makes the feedback loop more real-time.
upvoted 49 times

  MJK Highly Voted  4 years ago

Ans should be A
upvoted 13 times

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 8 QUESTION 2 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 2

Topic #: 8

[All Professional Cloud Architect Questions]

The TerramEarth development team wants to create an API to meet the company's business requirements. You want the development team to focus their development effort on business value versus creating a custom framework. Which method should they use?

A. Use Google App Engine with Google Cloud Endpoints. Focus on an API for dealers and partners

B. Use Google App Engine with a JAX-RS Jersey Java-based framework. Focus on an API for the public

C. Use Google App Engine with the Swagger (Open API Specification) framework. Focus on an API for the public

D. Use Google Container Engine with a Django Python container. Focus on an API for the public

E. Use Google Container Engine with a Tomcat container with the Swagger (Open API Specification) framework. Focus on an API for dealers and partners


by  MJK at Dec. 30, 2019, 12:20 p.m.

Comments
  kvokka Highly Voted  3 years, 11 months ago

agree with A
upvoted 32 times

  Vika Highly Voted  2 years, 10 months ago

Google offers Cloud Endpoint to develop, deploy and manage APIs on any google cloud backend.
https://cloud.google.com/endpoints

With Endpoints Frameworks, you don't have to deploy a third-party web server (such as Apache Tomcat or Gunicorn) with your application. You
annotate or decorate the code and deploy your application as you normally would to the App Engine standard environment.

Cloud Endpoints Frameworks for the App Engine standard environment : https://cloud.google.com/endpoints/docs/frameworks/about-cloud-
endpoints-frameworks
upvoted 11 times

  VSMu Most Recent  11 months, 2 weeks ago

Not sure why these questions are using the term Google Container Engine instead of Google Kubernetes Engine. That is so confusing
upvoted 1 times

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 8 QUESTION 3 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 3

Topic #: 8

[All Professional Cloud Architect Questions]

Your development team has created a structured API to retrieve vehicle data. They want to allow third parties to develop tools for dealerships that use this vehicle event data. You want to support delegated authorization against this data. What should you do?

A. Build or leverage an OAuth-compatible access control system

B. Build SAML 2.0 SSO compatibility into your authentication system

C. Restrict data access based on the source IP address of the partner systems

D. Create secondary credentials for each dealer that can be given to the trusted third party


by  Ramrao14 at Nov. 30, 2019, 10:01 p.m.

Comments
  ravisar Highly Voted  2 years, 1 month ago

SAML is an authentication system.


OAuth is an authorization system.

Both can be used with SSO (Single sign on). SAML is for users and OAuth is more for applications.
Answer A
upvoted 40 times

  huyhoang8344 1 year, 4 months ago


SAML can do both authentication and authorization, if I am not mistaken. But I agree A should be the answer.
upvoted 1 times
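Delegated authorization in practice means the third-party tool presents an access token limited to specific scopes. A toy sketch of the scope check (real OAuth validation also covers signature, issuer, and expiry; all names and scope strings here are made up):

```python
def authorize(token, required_scope):
    """Grant access only if the presented token carries the scope the
    API requires. This sketches delegation: the dealer consented to
    'vehicle.events.read', so that is all the third party can do."""
    return required_scope in token.get("scopes", [])

dealer_token = {"sub": "third-party-tool", "scopes": ["vehicle.events.read"]}
print(authorize(dealer_token, "vehicle.events.read"))
print(authorize(dealer_token, "vehicle.events.write"))
```

Scoped tokens are what makes option A "as restricted as possible" compared to sharing secondary credentials (option D).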

  AD2AD4 Highly Voted  3 years, 7 months ago

Final Decision to go with Option A.


Refer - https://cloud.google.com/docs/authentication
Good Read - https://cloud.google.com/blog/products/identity-security/identity-and-authentication-the-google-cloud-way
upvoted 25 times

  megumin Most Recent  1 year, 2 months ago

Selected Answer: A

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 8 QUESTION 4 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 4

Topic #: 8

[All Professional Cloud Architect Questions]

TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20 million 600-byte records a second, for 40 TB an hour. How should you design the data ingestion?

A. Vehicles write data directly to GCS

B. Vehicles write data directly to Google Cloud Pub/Sub

C. Vehicles stream data directly to Google BigQuery

D. Vehicles continue to write data using the existing system (FTP)


by  KouShikyou at Oct. 16, 2019, 5:24 a.m.

Comments
  jcmoranp Highly Voted  4 years, 2 months ago

It's Pub/Sub, too much data streaming for Bigquery...


upvoted 39 times

  alexspam88 2 years, 7 months ago


Too much for pubsub either https://cloud.google.com/pubsub/quotas
upvoted 4 times

  Bill831231 2 years, 3 months ago


thanks for sharing the link, but seems pub/sub can handle more streaming data than bigquery. pub/sub 120,000,000 kB per minute (2
GB/s) in large regions, bigquery is 1GB/s
upvoted 6 times
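The volume in the question can be checked with direct arithmetic (the quota figures quoted in this thread are the commenters' numbers, not necessarily current limits):

```python
records_per_sec = 20_000_000   # 20 million vehicles reporting each second
bytes_per_record = 600

bytes_per_sec = records_per_sec * bytes_per_record
gb_per_sec = bytes_per_sec / 1e9
tb_per_hour = bytes_per_sec * 3600 / 1e12

# 12 GB/s sustained, about 43 TB/h (the question rounds to 40 TB/h),
# which is why a buffering ingestion tier sits in front of the
# analytics store rather than streaming straight into it.
print(gb_per_sec, tb_per_hour)
```

Whatever the exact quotas, the sustained rate makes a decoupled, horizontally scalable ingest layer the safer design than direct streaming inserts.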

  JoeShmoe Highly Voted  4 years, 2 months ago

Its B, it exceeds the streaming limit for BQ


upvoted 20 times

  Vesta1807 Most Recent  2 weeks, 3 days ago

Selected Answer: C

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 8 QUESTION 5 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 5

Topic #: 8

[All Professional Cloud Architect Questions]

You analyzed TerramEarth's business requirement to reduce downtime, and found that they can achieve a majority of the time saving by reducing customers' wait time for parts. You decided to focus on reduction of the 3-week aggregate reporting time. Which modifications to the company's processes should you recommend?

A. Migrate from CSV to binary format, migrate from FTP to SFTP transport, and develop machine learning analysis of metrics

B. Migrate from FTP to streaming transport, migrate from CSV to binary format, and develop machine learning analysis of metrics

C. Increase fleet cellular connectivity to 80%, migrate from FTP to streaming transport, and develop machine learning analysis of metrics

D. Migrate from FTP to SFTP transport, develop machine learning analysis of metrics, and increase dealer local inventory by a fixed factor


by  VenkatGCP1 at Nov. 20, 2019, 1:43 p.m.

Comments
  shandy Highly Voted  4 years, 1 month ago

C is right choice because using cellular connectivity will greatly improve the freshness of data used for analysis from where it is now, collected
when the machines are in for maintenance. Streaming transport instead of periodic FTP will tighten the feedback loop even more. Machine
learning is ideal for predictive maintenance workloads.

A is not correct because machine learning analysis is a good means toward the end of reducing downtime, but shuffling formats and transport
doesn't directly help at all. B is not correct because machine learning analysis is a good means toward the end of reducing downtime, and
moving to streaming can improve the freshness of the information in that analysis, but changing the format doesn't directly help at all. D is not
correct because machine learning analysis is a good means toward the end of reducing downtime, but the rest of these changes don't directly
help at all.
upvoted 33 times

  nick_name_1 10 months, 3 weeks ago


There are 20 million TerramEarth vehicles in operation ... approximately 200,000 have cellular connectivity. So, you're saying that to keep cost low, they should increase the cellular bill from 1% connected to 80% connected? Statistical analysis does not require such a large sample size. C CANNOT BE RIGHT.
upvoted 2 times

  nick_name_1 10 months, 3 weeks ago


It's B.

 Google Discussions

 EXAM PROFESSIONAL CLOUD ARCHITECT TOPIC 8 QUESTION 6 DISCUSSION

Actual exam question from Google's Professional Cloud Architect

Question #: 6

Topic #: 8

[All Professional Cloud Architect Questions]

Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased Google Cloud Platform adoption?

A. Opex/capex allocation, LAN changes, capacity planning

B. Capacity planning, TCO calculations, opex/capex allocation

C. Capacity planning, utilization measurement, data center expansion

D. Data Center expansion, TCO calculations, utilization measurement


by  sri007 at Jan. 17, 2020, 8:07 a.m.

Comments
  sri007 Highly Voted  3 years, 12 months ago

Correct Answer B

Capacity planning, TCO calculations, opex/capex allocation

From the case study, it can be concluded that management (CXOs) are all concerned with rapid provisioning of resources (infrastructure) for growth as well as cost management, such as cost optimization in infrastructure, trading up-front capital expenditures (capex) for ongoing operating expenditures (opex), and total cost of ownership (TCO).
upvoted 27 times

  nick_name_1 10 months, 3 weeks ago


The only issue I have with B is that they may currently be leasing rather than owning compute, meaning that capex/opex considerations don't change.
upvoted 2 times

  tartar 3 years, 5 months ago


B is ok
upvoted 7 times

  nitinz 2 years, 10 months ago


B is correct.
upvoted 1 times
1. EHR Healthcare —
https://services.google.com/fh/files/blogs/master_case_study_ehr_healthcare.pdf

EHR Healthcare is a leading provider of electronic health record software to the
medical industry. EHR Healthcare provides their software as a service to
multi-national medical offices, hospitals, and insurance providers.

Problem Statement: cloud architecture for the on-premises to cloud migration and
for scaling.

Solution:

Infrastructure

Google’s IaaS and PaaS solutions for data centers, Global VPC

Multi-regional replication for DR

Hybrid Connectivity — Cloud Interconnect (99.99% uptime SLA), Cloud VPN

Dedicated Interconnect for a high-performance connection between on-premises
and GCP

Applications

Kubernetes Deployment — GKE

Management and integration of cloud and on-premises container-based
environments — Anthos

Future API based integration — Apigee

Predictions — AI Platform

Data Ingestion — Streaming (Pub/Sub), Batch (Cloud Storage)

Process — Dataflow, Cloud Composer

Databases

MySQL, MS SQL — Cloud SQL

Redis — Cloud Memorystore


MongoDB — Cloud Firestore

Monitoring

Cloud Monitoring — Alerts and notifications, Charts and dashboards

Cloud Logging — Automatically ingest audit and platform logs, manage retention
and policies.
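Cloud Logging ingests platform and audit logs automatically, and application logs arrive as structured entries when they are emitted as JSON lines. A minimal stdlib-only sketch of such a formatter follows; the field names are illustrative, not EHR's actual schema:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each record as one JSON line, the shape Cloud Logging
    parses into structured (jsonPayload) entries."""

    def format(self, record):
        entry = {
            "severity": record.levelname,
            "message": record.getMessage(),
            "logger": record.name,
        }
        return json.dumps(entry)


# Wire the formatter onto a stream handler, as an app container would.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("ehr.audit")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("patient record viewed")
```

In a real deployment the container runtime forwards these stdout/stderr lines to Cloud Logging, where retention and policies are then managed centrally.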

Continuous Deployment

Use Terraform for Infrastructure as Code

Use Cloud Source Repositories for storing the source code

Use Cloud Build for deployment and orchestration.

Use Artifact Registry for container images

2. Helicopter Racing League —
https://services.google.com/fh/files/blogs/master_case_study_helicopter_racing_league.pdf

Helicopter Racing League (HRL) is a global sports league for competitive helicopter
racing. Each year HRL holds the world championship and several regional league
competitions where teams compete to earn a spot in the world championship. HRL
offers a paid service to stream the races all over the world with live telemetry and
predictions throughout each race.

Problem Statement: cloud AI & ML, telemetry, and streaming.

Solution:

Transcoding

Use preemptible instances for the VM-based encoding solution

Containerise the encoding solution and manage it with Kubernetes Engine.

TV Box Telemetry
App Engine, Pub/Sub, Dataflow, BigQuery, Cloud Composer, and Cloud Monitoring
to increase telemetry and create additional insights.
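The Pub/Sub → Dataflow → BigQuery path boils down to windowed aggregation of telemetry events. A local, stdlib-only simulation of that idea follows; the event shape and the 60-second window are assumptions for illustration:

```python
from collections import defaultdict


def window_averages(events, window_secs=60):
    """Group (timestamp, value) events into fixed windows and average the
    metric per window, as a streaming pipeline would before loading BigQuery."""
    buckets = defaultdict(list)
    for ts, value in events:
        # Align each event to the start of its fixed window.
        window_start = ts // window_secs * window_secs
        buckets[window_start].append(value)
    return {start: sum(vals) / len(vals) for start, vals in buckets.items()}


events = [(0, 10.0), (30, 20.0), (65, 40.0)]
print(window_averages(events))  # one bucket per 60-second window
```

In the real pipeline Pub/Sub carries the events, Dataflow performs the windowing at scale, and the aggregates are streamed into BigQuery tables.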

Live Video Latency

Use an HA configuration of Cloud VPN for connectivity between the mobile data
centers and Google Cloud.

Use Cloud CDN for delivering content with speed, efficiency and reliability closer
to the users.

Use Cloud Storage with multi-regional buckets to serve contents.

Use PerfKit Benchmarker to get visibility into metrics like latency, throughput,
and jitter.

Google Cloud offers Network Intelligence Center for comprehensive and
proactive monitoring, troubleshooting, and optimization capabilities
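PerfKit Benchmarker surfaces metrics such as latency, throughput, and jitter. In a simplified form, jitter can be summarized as the mean absolute difference between consecutive latency samples; the sketch below is illustrative and the exact statistic PerfKit reports may differ:

```python
from statistics import mean


def latency_stats(samples_ms):
    """Summarize latency samples: mean latency plus a simple jitter figure
    (mean absolute difference between consecutive samples)."""
    diffs = [abs(b - a) for a, b in zip(samples_ms, samples_ms[1:])]
    return {
        "mean_ms": mean(samples_ms),
        "jitter_ms": mean(diffs) if diffs else 0.0,
    }


print(latency_stats([20.0, 24.0, 22.0, 30.0]))
```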

AI & ML

Use AI platform for predictive efficiency

Use TensorFlow Deep Learning VM instances

Analytics

Use BigQuery as a data mart for processing of large volume of data.

Use Looker for embedded analytics.

The BigQuery streaming API and an ML solution can create additional insights to
increase fan engagement.

3. TerramEarth —
https://services.google.com/fh/files/blogs/master_case_study_terramearth.pdf

TerramEarth manufactures heavy equipment for the mining and agricultural
industries. They currently have over 500 dealers and service centers in 100
countries. There are 2 million TerramEarth vehicles in operation currently, and
they see 20% yearly growth. Their mission is to build products that make their
customers more productive.

Problem Statement: cloud automation, operations, and the API ecosystem.

Solution:

Data Replication:

Stream critical data from vehicles to Cloud Bigtable to drive analytics in real time.

Sensor Device -> HTTPS Gateway Device -> Pub/Sub -> Dataflow -> Bigtable ->
GKE (Application & Presentation)

BigQuery with timestamp partitioning for home-base uploads and unified analytics.
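Timestamp partitioning stores each BigQuery row in a partition derived from its timestamp (daily by default), so home-base batch uploads can be grouped by partition before loading. The grouping can be sketched locally like this; the row shape is hypothetical:

```python
from collections import defaultdict
from datetime import datetime, timezone


def partition_key(ts_seconds):
    """Daily partition id (YYYYMMDD, UTC) — the default granularity for a
    timestamp-partitioned BigQuery table."""
    return datetime.fromtimestamp(ts_seconds, tz=timezone.utc).strftime("%Y%m%d")


def group_by_partition(rows):
    """Bucket upload rows by their target daily partition."""
    parts = defaultdict(list)
    for row in rows:
        parts[partition_key(row["ts"])].append(row)
    return dict(parts)


rows = [{"ts": 0, "v": 1}, {"ts": 86_400, "v": 2}]
print(sorted(group_by_partition(rows)))  # ['19700101', '19700102']
```

Querying with a filter on the partitioning column then lets BigQuery prune partitions, which keeps unified analytics over years of vehicle data affordable.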

Data processing

Cloud Dataflow for serverless unified (batch & stream) ETL

ML Engine or AutoML Tables

Use Vertex AI for ML lifecycle to forecast anticipated stock needs to assist with
just-in-time repairs.
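As a baseline for the Vertex AI stock forecast, a naive moving average over recent demand already illustrates the just-in-time idea; the window size and data below are hypothetical, and a real model would account for seasonality and lead times:

```python
def forecast_next(demand_history, window=3):
    """Naive just-in-time stock forecast: the average of the last `window`
    demand observations. A stand-in baseline, not the Vertex AI model itself."""
    recent = demand_history[-window:]
    return sum(recent) / len(recent)


# Weekly part demand for one depot; forecast next week's need.
print(forecast_next([10, 12, 11, 13, 14]))  # average of [11, 13, 14]
```

Comparing a trained Vertex AI model against a baseline like this is a common sanity check before trusting its forecasts for repairs stocking.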

Vehicles — Home Base Connected

Device management & upload

Cloud IoT Core

IoT devices -> Cloud Pub/Sub

Cloud Dataflow -> Cloud Storage

Cloud Operations:

Managed and Serverless services

Network Intelligence Center for monitoring, verification, and optimization.

Network Connectivity Center & Security Command Center for holistic security
view.

Cloud Monitoring for real time visibility

Google Cloud KMS for Key Management


API’s ecosystem:

Apigee (X) to manage and monitor APIs; it creates an abstraction layer to
connect to different interfaces.

Apigee Developer Portal lets you build a self-service portal for internal and
external developers.

Build and deploy APIs on Google Kubernetes Engine

CICD:

Use Cloud Source Repositories, Artifact Registry, and Cloud Build for CI/CD
operations.

Use Spinnaker to deploy on Kubernetes with blue-green and canary
deployments.
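A canary rollout promotes the new version only while its metrics stay close to the baseline's. The toy decision rule below is reduced to a single error-rate metric; Spinnaker's automated canary analysis compares many metrics and is far more sophisticated:

```python
def canary_decision(baseline_error_rate, canary_error_rate, tolerance=0.01):
    """Promote the canary only if its error rate is within `tolerance` of the
    baseline; otherwise roll back."""
    if canary_error_rate <= baseline_error_rate + tolerance:
        return "promote"
    return "rollback"


print(canary_decision(0.02, 0.025))  # within tolerance of the baseline
print(canary_decision(0.02, 0.08))   # clear regression, roll back
```

Blue-green deployment uses the same comparison idea, except traffic flips all at once between two full environments instead of ramping gradually.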

Remote Workforce:

G Suite with integrated Cloud IAM.

Cloud Data Loss Prevention for sensitive data protection.

Use Connected Sheets with BigQuery to collaborate with integrated security
controls.

4. Mountkirk Games —
https://services.google.com/fh/files/blogs/master_case_study_mountkirk_games.pdf

Mountkirk Games makes online, session-based, multiplayer games for mobile
platforms. They have recently started expanding to other platforms after
successfully migrating their on-premises environments to Google Cloud. Mountkirk
Games is building a new multiplayer game that they expect to be very popular.

Problem Statement: cloud autoscaling and gaming analytics.

Solution:

Cloud Spanner with relational features, horizontal scaling, and 99.999%
availability across regions.

Google Kubernetes Engine (GKE) to deploy the game's backend as microservices

Cloud Load Balancing for worldwide seamless autoscaling

Pub/Sub, Dataflow and BigQuery for Stream analytics

Looker for player insights and analytics

GPUs hardware accelerators on GKE.

Cloud Datastore for transactional game state

Cloud Storage for storing game activity logs and analysed using BigQuery

Cloud Pub/Sub for buffering of live and late data

Cloud Dataflow for bulk and stream processing

BigQuery for storage and analytics; this can also contain the 10 TB historic data

Managed and Serverless services for dynamic scaling, minimal cost and
operations.

Cloud Operations metrics and APM functionality for proactive troubleshooting.
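Buffering live and late data in Pub/Sub works together with Dataflow's watermark and allowed-lateness semantics: events behind the watermark but within the lateness bound still update results, while older ones are discarded. A simplified local model of that classification follows; the timestamps and the 120-second bound are illustrative:

```python
def classify_events(events, watermark, allowed_lateness=120):
    """Split buffered (timestamp, payload) events into on-time, late-but-
    accepted, and dropped, mimicking Dataflow's late-data handling."""
    on_time, late, dropped = [], [], []
    for ts, payload in events:
        if ts >= watermark:
            on_time.append(payload)
        elif ts >= watermark - allowed_lateness:
            late.append(payload)
        else:
            dropped.append(payload)
    return on_time, late, dropped


# Game activity events drained from the Pub/Sub buffer.
events = [(1000, "a"), (950, "b"), (700, "c")]
print(classify_events(events, watermark=1000))
```

Late-but-accepted events trigger updated window results downstream, which is why Pub/Sub buffering plus Dataflow gives Mountkirk accurate analytics even when mobile clients upload with delays.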

If you are interested in a classroom-style tutorial of GCP services along with
an architectural framework and best practices, check out this course:
https://www.udemy.com/course/gcp-architect-am/?couponCode=GCP-OCT

Written by Ashutosh Mishra

Multi-cloud Architect & Advisor