Google Cloud Architect Exam Discussions
Question #: 1
Topic #: 6
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to migrate from their current analytics and statistics
reporting model to one that meets their technical requirements on Google Cloud Platform.
Which two steps should be part of their migration plan? (Choose two.)
A. Evaluate the impact of migrating their current batch ETL code to Cloud Dataflow.
B. Write a schema migration plan to denormalize data for better performance in BigQuery.
C. Draw an architecture diagram that shows how to move from a single MySQL database to a MySQL cluster.
D. Load 10 TB of analytics data from a previous game into a Cloud SQL instance, and run test queries against the full dataset to confirm
E. Integrate Cloud Armor to defend against possible SQL injection attacks in analytics files uploaded to Cloud Storage.
Comments
sri007 Highly Voted 3 years, 12 months ago
Correct answer: A and B.
Evaluate the impact of migrating their current batch ETL code to Cloud Dataflow.
Write a schema migration plan to denormalize data for better performance in BigQuery.
Agree, A and B.
upvoted 12 times
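To make option A concrete, here is a minimal sketch of what a batch Cloud Dataflow (Apache Beam) pipeline replacing the existing batch ETL code might look like; the project, bucket, dataset, and field names are hypothetical placeholders, not Mountkirk's actual schema:

# Minimal sketch (hypothetical names): batch Apache Beam pipeline that reads
# staged CSV game statistics from Cloud Storage and writes denormalized rows
# to BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line):
    """Turn one CSV line into a flat (denormalized) dict for BigQuery."""
    player_id, game_id, score, ts = line.split(",")
    return {"player_id": player_id, "game_id": game_id,
            "score": int(score), "event_time": ts}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",          # or "DirectRunner" for local testing
        project="my-project",             # hypothetical project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadStats" >> beam.io.ReadFromText("gs://my-bucket/stats/*.csv")
         | "ParseCsv" >> beam.Map(parse_csv)
         | "WriteToBQ" >> beam.io.WriteToBigQuery(
               "my-project:analytics.game_stats",
               schema="player_id:STRING,game_id:STRING,score:INTEGER,event_time:TIMESTAMP",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))


if __name__ == "__main__":
    run()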
Question #: 2
Topic #: 6
For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the compute
workloads for your company, Mountkirk Games. Considering the Mountkirk Games business and technical requirements, what should you do?
C. Create a global load balancer with managed instance groups and autoscaling policies. Use preemptible Compute Engine instances.
D. Create a global load balancer with managed instance groups and autoscaling policies. Use non-preemptible Compute Engine instances.
Comments
dabrat Highly Voted 4 years, 1 month ago
Agree "D". Preemptible VM is suitable for app which is fault-tolerant. Termination of preemptive VM might affect gaming experience, so it is not
a good choice.
upvoted 18 times
Selected Answer: D
D is correct in my opinion.
upvoted 1 times
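For context on what the managed-instance-group autoscaling in C/D involves, a rough sketch using the google-cloud-compute client is below; the message and field names follow the auto-generated Compute API classes, and the project, zone, and group names are hypothetical:

# Hedged sketch (hypothetical names): attach an autoscaling policy to an
# existing managed instance group using the google-cloud-compute client.
from google.cloud import compute_v1

PROJECT = "my-project"        # hypothetical
ZONE = "us-central1-a"        # hypothetical
MIG_NAME = "game-backend-mig"

autoscaler = compute_v1.Autoscaler(
    name=f"{MIG_NAME}-autoscaler",
    # Target is the URL of the managed instance group to scale.
    target=f"projects/{PROJECT}/zones/{ZONE}/instanceGroupManagers/{MIG_NAME}",
    autoscaling_policy=compute_v1.AutoscalingPolicy(
        min_num_replicas=3,          # keep a baseline for latency-sensitive game traffic
        max_num_replicas=50,
        cool_down_period_sec=90,
        cpu_utilization=compute_v1.AutoscalingPolicyCpuUtilization(
            utilization_target=0.6,  # scale out before instances saturate
        ),
    ),
)

client = compute_v1.AutoscalersClient()
client.insert(project=PROJECT, zone=ZONE, autoscaler_resource=autoscaler)
print("Autoscaler insert requested")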
Question #: 3
Topic #: 6
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to design their solution for the future in order to take
advantage of cloud and technology improvements as they become available. Which two steps should they take? (Choose two.)
A. Store as much analytics and game activity data as financially feasible today so it can be used to train machine learning models to
B. Begin packaging their game backend artifacts in container images and running them on Google Kubernetes Engine to improve the
C. Set up a CI/CD pipeline using Jenkins and Spinnaker to automate canary deployments and improve development velocity.
D. Adopt a schema versioning tool to reduce downtime when adding new game features that require storing additional player data in the
database.
E. Implement a weekly rolling maintenance process for the Linux virtual machines so they can apply critical kernel patches and package
Comments
dabrat Highly Voted 4 years, 1 month ago
A + B. Both are supported by the case study: "...as well as other metrics that provide deeper insight into usage patterns so we can adapt the game to target users" and "...environment that provides autoscaling, low latency load balancing, and frees us up from managing physical servers."
upvoted 80 times
Having a CI/CD pipeline means you can deploy changes to environments faster. Does this help you take advantage of cloud and technology
improvements as they become available in the future? Yes. When new features become available, you can incorporate them into your
application and deploy them to test/production environments easily/efficiently and decrease the time to go live.
Storing more data and using it as training data for machine learning is the right answer.
Question #: 4
Topic #: 6
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants you to design a way to test the analytics platform's resilience to changes in mobile network latency. What should you do?
A. Deploy failure injection software to the game analytics platform that can inject additional latency to mobile client analytics traffic.
B. Build a test client that can be run from a mobile phone emulator on a Compute Engine virtual machine, and run multiple copies in
Google Cloud Platform regions all over the world to generate realistic traffic.
C. Add the ability to introduce a random amount of delay before beginning to process analytics files uploaded from mobile devices.
D. Create an opt-in beta of the game that runs on players' mobile devices and collects response times from analytics endpoints running in
Comments
a66030 Highly Voted 3 years, 2 months ago
The answer is A. The question asks how to test the analytics platform's resilience to changes in mobile network latency.
Only A adds latency to the mobile network.
C adds a delay at the beginning of file processing; it does not add delay or latency in the mobile network.
One of the analytics requirements in the case study is: "Process data that arrives late because of slow mobile networks."
upvoted 37 times
Add the ability to introduce a random amount of delay before beginning to process analytics files uploaded from mobile devices.
upvoted 28 times
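To illustrate the difference the commenters are debating, a toy test client that simulates slow mobile networks by delaying uploads of analytics files to Cloud Storage might look like this; the bucket and file names are hypothetical placeholders:

# Illustrative sketch only (hypothetical names): simulate a slow/late mobile
# network by sleeping for a random delay before uploading an analytics file.
import random
import time

from google.cloud import storage


def upload_with_simulated_latency(bucket_name, local_path, blob_name,
                                  min_delay_s=0.5, max_delay_s=30.0):
    """Delay the upload to mimic a client on a slow mobile network."""
    delay = random.uniform(min_delay_s, max_delay_s)
    print(f"Simulating mobile network latency: waiting {delay:.1f}s before upload")
    time.sleep(delay)

    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.blob(blob_name).upload_from_filename(local_path)
    print(f"Uploaded {local_path} as gs://{bucket_name}/{blob_name}")


if __name__ == "__main__":
    upload_with_simulated_latency("mountkirk-analytics-staging",   # hypothetical bucket
                                  "session_0001.json",
                                  "late-arrivals/session_0001.json")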
Question #: 5
Topic #: 6
For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the database
workloads for your company, Mountkirk Games. Considering the business and technical requirements, what should you do?
A. Use Cloud SQL for time series data, and use Cloud Bigtable for historical data queries.
B. Use Cloud SQL to replace MySQL, and use Cloud Spanner for historical data queries.
C. Use Cloud Bigtable to replace MySQL, and use BigQuery for historical data queries.
D. Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.
Comments
misho Highly Voted 3 years, 7 months ago
For the people who say it's C in Linux Academy: did you see the technical requirements there? The old technical requirements have the line "Connect to a managed NoSQL database service", but in the technical requirements on Google's official site, and in this question, that line is replaced with the following two lines: "Connect to a transactional database service to manage user profiles and game state" and "Store game activity in a timeseries database service for future analysis". With those, D is definitely the answer!
upvoted 65 times
Correct Answer D
Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use BigQuery for historical data queries.
Storing time-series data in Cloud Bigtable is a natural fit. Cloud Spanner scales horizontally and serves data with low latency while maintaining transactional consistency and industry-leading 99.999% (five nines) availability, which is 10x less downtime than four nines (under 5 minutes per year), so Cloud Spanner helps future-proof your database backend. After you load your data into BigQuery, you can query the data in your tables; BigQuery supports two types of queries: interactive and batch.
upvoted 40 times
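A minimal sketch of the Bigtable time-series piece of answer D, assuming a hypothetical instance, table, and column family, and a row-key design of player ID plus reversed timestamp so recent events sort first:

# Hedged sketch (hypothetical names): write one game-activity event per row.
import datetime
import sys

from google.cloud import bigtable

client = bigtable.Client(project="my-project", admin=False)   # hypothetical project
instance = client.instance("game-activity-instance")          # hypothetical instance
table = instance.table("game_activity")                       # hypothetical table


def write_activity_event(player_id, event_type, value):
    now = datetime.datetime.utcnow()
    # Reversed timestamp keeps the most recent events at the top of a prefix scan.
    reverse_ts = sys.maxsize - int(now.timestamp() * 1000)
    row_key = f"{player_id}#{reverse_ts}".encode()

    row = table.direct_row(row_key)
    row.set_cell("activity", "event_type", event_type, timestamp=now)
    row.set_cell("activity", "value", str(value), timestamp=now)
    row.commit()


write_activity_event("player-42", "boss_defeated", 1300)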
Question #: 6
Topic #: 6
For this question, refer to the Mountkirk Games case study. Which managed storage option meets Mountkirk's technical requirement for
A. Cloud Bigtable
B. Cloud Spanner
C. BigQuery
D. Cloud Datastore
Comments
Eroc Highly Voted 4 years, 2 months ago
Correct Answer A
Cloud Bigtable
Selected Answer: A
Google Bigtable is a fully managed, scalable NoSQL database service for large analytical and operational workloads.
upvoted 1 times
Question #: 1
Topic #: 7
You need to optimize batch file transfers into Cloud Storage for Mountkirk Games' new Google Cloud solution. The batch files contain game
statistics that need to be staged in Cloud Storage and be processed by an extract transform load (ETL) tool. What should you do?
Comments
kopper2019 Highly Voted 2 years, 6 months ago
hey guys new Qs posted as of July 12th, 2021, All 21 new Qs in Question #152
upvoted 13 times
Selected Answer: B
I voted A.
I understand the impulse to choose B, parallel upload. But remember that this is the first step in an ETL process in which, as I understand it, an extraction process starts as soon as a file is uploaded, followed by the load. If multiple files finish uploading at the same time, the extraction and loading processes get triggered in parallel too, which can cause errors if they are not prepared to handle that.
Anyway, now that I read the question again, I don't see that it specifies any kind of automation like I was assuming, so B is probably fine in this case...
upvoted 1 times
I changed my mind, it's B!
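For reference, a parallel upload along the lines of option B could use the transfer_manager helper in the google-cloud-storage library; the bucket name, source directory, and worker count below are hypothetical placeholders:

# Hedged sketch (hypothetical names): upload many batch files to the staging
# bucket concurrently with the transfer_manager helper.
from pathlib import Path

from google.cloud import storage
from google.cloud.storage import transfer_manager

BUCKET = "mountkirk-batch-staging"   # hypothetical bucket
SOURCE_DIR = "batch_exports"         # hypothetical local directory of batch files

client = storage.Client()
bucket = client.bucket(BUCKET)

filenames = [p.name for p in Path(SOURCE_DIR).glob("*.csv")]
results = transfer_manager.upload_many_from_filenames(
    bucket,
    filenames,
    source_directory=SOURCE_DIR,
    max_workers=8,                   # upload files in parallel
)

for name, result in zip(filenames, results):
    # Each result is either None (success) or the exception raised for that file.
    status = "ok" if result is None else f"failed: {result}"
    print(f"{name}: {status}")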
Question #: 2
Topic #: 7
You are implementing Firestore for Mountkirk Games. Mountkirk Games wants to give a new game programmatic access to a legacy game's
Firestore database.
A. Create a service account (SA) in the legacy game's Google Cloud project, add a second SA in the new game's IAM page, and then give
B. Create a service account (SA) in the legacy game's Google Cloud project, give the SA the Organization Admin role, and then give it the
C. Create a service account (SA) in the legacy game's Google Cloud project, add this SA in the new game's IAM page, and then give it the
D. Create a service account (SA) in the legacy game's Google Cloud project, give it the Firebase Admin role, and then migrate the new
Comments
MamthaSJ Highly Voted 2 years, 6 months ago
Answer is C
upvoted 15 times
I think it should not simply give out the Organization Admin role, so A and B are out. We should not migrate the new game to the legacy game's project, so D is out. That leaves C as the only choice.
upvoted 9 times
Selected Answer: C
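A minimal sketch of the programmatic-access half of answer C: once the service account has been granted the needed Firestore/Datastore role, the new game authenticates as that SA and reads the legacy game's database. The key path, project ID, and collection name are hypothetical placeholders; in practice, key files can be avoided by using service account impersonation or Workload Identity.

# Hedged sketch (hypothetical names): read the legacy game's Firestore data as
# the shared service account.
from google.cloud import firestore
from google.oauth2 import service_account

# Key for the SA created in the legacy game's project (hypothetical path).
creds = service_account.Credentials.from_service_account_file(
    "legacy-game-sa-key.json")

# Point the client at the legacy game's project explicitly.
db = firestore.Client(project="legacy-game-project", credentials=creds)

for doc in db.collection("player_profiles").limit(10).stream():
    print(doc.id, doc.to_dict())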
Question #: 3
Topic #: 7
Mountkirk Games wants to limit the physical location of resources to their operating Google Cloud regions. What should you do?
C. Configure the quotas for resources in the regions not being used to 0.
D. Configure a custom alert in Cloud Monitoring so you can disable resources as they are created in other regions.
Comments
MamthaSJ Highly Voted 2 years, 6 months ago
Answer is A
upvoted 29 times
A is correct.
You can limit the physical location of a new resource with the Organization Policy Service resource locations constraint. You can use the location
property of a resource to identify where it is deployed and maintained by the service. For data-containing resources of some Google Cloud
services, this property also reflects the location where data is stored. This constraint allows you to define the allowed Google Cloud locations
where the resources for supported services in your hierarchy can be created.
After you define resource locations, this limitation will apply only to newly-created resources. Resources you created before setting the resource
locations constraint will continue to exist and perform their function.
https://cloud.google.com/resource-manager/docs/organization-policy/defining-locations
upvoted 28 times
Selected Answer: A
A proactive step is better than a reactive one; A is better than C.
upvoted 1 times
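To make answer A concrete, this is roughly the shape of a resource-locations organization policy; the org ID and allowed regions are hypothetical, and the generated file would typically be applied with "gcloud org-policies set-policy resource-locations.yaml":

# Hedged sketch (hypothetical org ID and regions): generate the YAML for a
# gcp.resourceLocations organization policy restricting where new resources
# can be created.
import pathlib

ORG_ID = "123456789012"                      # hypothetical organization ID
ALLOWED = ["us-central1", "europe-west1"]    # hypothetical operating regions

allowed_lines = "\n".join(f"          - {loc}" for loc in ALLOWED)
policy_yaml = f"""\
name: organizations/{ORG_ID}/policies/gcp.resourceLocations
spec:
  rules:
    - values:
        allowedValues:
{allowed_lines}
"""

pathlib.Path("resource-locations.yaml").write_text(policy_yaml)
print(policy_yaml)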
Question #: 1
Topic #: 8
TerramEarth's CTO wants to use the raw data from connected vehicles to help identify approximately when a vehicle in the field will have a
catastrophic failure.
A.
B.
C.
D.
Comments
hems4all Highly Voted 3 years, 2 months ago
A is correct
As described in the Designing a Connected Vehicle Platform on Cloud IoT Core case study,
1. Google Cloud Dataflow is essential to transform, enrich and then store telemetry data by using distributed data pipelines
2. Cloud Pub/Sub is essential to handle the streams of vehicle data while at the same time decoupling the specifics of the backend processing
implementation
For the first point, there is no doubt that BigQuery is the preferred choice for analytics. Cloud SQL does not scale to this sort of data volume
(9TB/day + data coming through when vehicles are serviced).
For the second point, GKE with Cloud Load Balancing is a better fit than App Engine. App Engine is a regional service whereas, with the other
option, you can have multiple GKE clusters in different regions. And Cloud Load Balancing can send requests to the cluster in the region that is
closest to the vehicle. This option minimizes the latency and makes the feedback loop more real-time.
upvoted 49 times
Ans should be A
upvoted 13 times
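A rough sketch of the streaming ingestion path described above (Pub/Sub into Dataflow into BigQuery), assuming hypothetical topic, table, and field names:

# Hedged sketch (hypothetical names): streaming Apache Beam pipeline that reads
# vehicle telemetry from Pub/Sub and streams it into BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def to_row(message_bytes):
    """Decode one Pub/Sub message (JSON) into a BigQuery row dict."""
    event = json.loads(message_bytes.decode("utf-8"))
    return {"vehicle_id": event["vehicle_id"],
            "metric": event["metric"],
            "value": float(event["value"]),
            "event_time": event["event_time"]}


def run():
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-project",              # hypothetical
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadTelemetry" >> beam.io.ReadFromPubSub(
               topic="projects/my-project/topics/vehicle-telemetry")
         | "Decode" >> beam.Map(to_row)
         | "WriteToBQ" >> beam.io.WriteToBigQuery(
               "my-project:telemetry.vehicle_events",
               schema="vehicle_id:STRING,metric:STRING,value:FLOAT,event_time:TIMESTAMP",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))


if __name__ == "__main__":
    run()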
Question #: 2
Topic #: 8
The TerramEarth development team wants to create an API to meet the company's business requirements. You want the development team to
focus their development effort on business value versus creating a custom framework.
A. Use Google App Engine with Google Cloud Endpoints. Focus on an API for dealers and partners
B. Use Google App Engine with a JAX-RS Jersey Java-based framework. Focus on an API for the public
C. Use Google App Engine with the Swagger (Open API Specification) framework. Focus on an API for the public
D. Use Google Container Engine with a Django Python container. Focus on an API for the public
E. Use Google Container Engine with a Tomcat container with the Swagger (Open API Specification) framework. Focus on an API for
Comments
kvokka Highly Voted 3 years, 11 months ago
agree with A
upvoted 32 times
Google offers Cloud Endpoints to develop, deploy, and manage APIs on any Google Cloud backend.
https://cloud.google.com/endpoints
With Endpoints Frameworks, you don't have to deploy a third-party web server (such as Apache Tomcat or Gunicorn) with your application. You annotate or decorate the code and deploy your application as you normally would to the App Engine standard environment.
Cloud Endpoints Frameworks for the App Engine standard environment: https://cloud.google.com/endpoints/docs/frameworks/about-cloud-endpoints-frameworks
upvoted 11 times
Not sure why these questions are using the term Google Container Engine instead of Google Kubernetes Engine. That is so confusing
upvoted 1 times
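Based on the legacy Endpoints Frameworks echo sample linked above, a minimal App Engine standard environment (Python) API might look like the sketch below; the API name, message fields, and method paths are hypothetical placeholders:

# Hedged sketch (hypothetical names): a minimal Endpoints Frameworks API for
# the App Engine standard environment, defined with decorators instead of a
# custom framework.
import endpoints
from protorpc import messages, remote


class VehicleRequest(messages.Message):
    vehicle_id = messages.StringField(1, required=True)


class VehicleResponse(messages.Message):
    vehicle_id = messages.StringField(1)
    status = messages.StringField(2)


@endpoints.api(name="dealerapi", version="v1",
               description="Hypothetical dealer/partner API for TerramEarth")
class DealerApi(remote.Service):

    @endpoints.method(VehicleRequest, VehicleResponse,
                      path="vehicles/status", http_method="POST",
                      name="vehicles.status")
    def vehicle_status(self, request):
        # Real logic would look the vehicle up; this sketch just echoes it back.
        return VehicleResponse(vehicle_id=request.vehicle_id, status="OK")


# WSGI application that App Engine serves; Endpoints generates the API config.
api = endpoints.api_server([DealerApi])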
Question #: 3
Topic #: 8
Your development team has created a structured API to retrieve vehicle data. They want to allow third parties to develop tools for dealerships
that use this vehicle event data. You want to support delegated authorization against this data.
C. Restrict data access based on the source IP address of the partner systems
D. Create secondary credentials for each dealer that can be given to the trusted third party
Comments
ravisar Highly Voted 2 years, 1 month ago
Both can be used with SSO (single sign-on). SAML is for users; OAuth is more for applications.
Answer: A
upvoted 40 times
Selected Answer: A
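To illustrate the OAuth side of answer A, here is a sketch of how the vehicle-data API could verify a Google-issued token presented by a third-party tool, using the google-auth library; the client ID is a hypothetical placeholder, and a production deployment would more likely delegate this to Cloud Endpoints or Apigee:

# Hedged sketch (hypothetical client ID): verify a delegated OAuth/OpenID
# Connect token before serving dealer data.
from google.auth.transport import requests as google_requests
from google.oauth2 import id_token

EXPECTED_CLIENT_ID = "3rd-party-tool.apps.googleusercontent.com"  # hypothetical


def caller_identity(bearer_token):
    """Verify the ID token and return the authorized caller's identity claims."""
    claims = id_token.verify_oauth2_token(
        bearer_token, google_requests.Request(), EXPECTED_CLIENT_ID)
    # The claims contain the issuer-verified subject/email the token was
    # delegated to; authorization decisions can be based on these claims.
    return {"subject": claims["sub"], "email": claims.get("email")}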
Google Discussions
Question #: 4
Topic #: 8
TerramEarth plans to connect all 20 million vehicles in the field to the cloud. This increases the volume to 20 million 600 byte records a
Comments
jcmoranp Highly Voted 4 years, 2 months ago
Selected Answer: C
Google Discussions
Question #: 5
Topic #: 8
You analyzed TerramEarth's business requirement to reduce downtime, and found that they can achieve a majority of time saving by reducing
customer's wait time for parts. You decided to focus on reduction of the 3 weeks aggregate reporting time.
A. Migrate from CSV to binary format, migrate from FTP to SFTP transport, and develop machine learning analysis of metrics
B. Migrate from FTP to streaming transport, migrate from CSV to binary format, and develop machine learning analysis of metrics
C. Increase fleet cellular connectivity to 80%, migrate from FTP to streaming transport, and develop machine learning analysis of metrics
D. Migrate from FTP to SFTP transport, develop machine learning analysis of metrics, and increase dealer local inventory by a fixed factor
Comments
shandy Highly Voted 4 years, 1 month ago
C is the right choice because using cellular connectivity will greatly improve the freshness of data used for analysis compared to today, when data is collected only when the machines are in for maintenance. Streaming transport instead of periodic FTP will tighten the feedback loop even more. Machine learning is ideal for predictive maintenance workloads.
A is not correct because machine learning analysis is a good means toward the end of reducing downtime, but shuffling formats and transport
doesn't directly help at all. B is not correct because machine learning analysis is a good means toward the end of reducing downtime, and
moving to streaming can improve the freshness of the information in that analysis, but changing the format doesn't directly help at all. D is not
correct because machine learning analysis is a good means toward the end of reducing downtime, but the rest of these changes don't directly
help at all.
upvoted 33 times
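A small sketch of the "FTP to streaming transport" piece of answer C: a vehicle gateway publishing telemetry records to Pub/Sub instead of uploading CSV files over FTP. The project, topic, and payload fields are hypothetical placeholders.

# Hedged sketch (hypothetical names): publish one telemetry reading per message.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")  # hypothetical


def publish_reading(vehicle_id, metric, value):
    payload = json.dumps({"vehicle_id": vehicle_id,
                          "metric": metric,
                          "value": value}).encode("utf-8")
    # Attributes let downstream subscribers filter without decoding the payload.
    future = publisher.publish(topic_path, payload, vehicle_id=vehicle_id)
    return future.result()  # blocks until the message ID is returned


message_id = publish_reading("veh-001", "hydraulic_pressure_psi", 2175)
print("Published message", message_id)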
Question #: 6
Topic #: 8
Which of TerramEarth's legacy enterprise processes will experience significant change as a result of increased Google Cloud Platform
adoption?
Comments
sri007 Highly Voted 3 years, 12 months ago
Correct Answer B
From the case study, we can conclude that management (the CXOs) are all concerned with rapid provisioning of resources (infrastructure) for growth as well as with cost management, such as cost optimization of infrastructure, trading up-front capital expenditures (CapEx) for ongoing operating expenditures (OpEx), and total cost of ownership (TCO).
upvoted 27 times
Problem statement: cloud architecture for an on-premises to cloud migration solution and a scale problem.
Solution:
- Infrastructure: Google's IaaS and PaaS solutions for data centers, global VPC.
- Applications
- Predictions: AI Platform.
- Databases
- Monitoring: Cloud Logging to automatically ingest audit and platform logs and manage retention and policies.
- Continuous deployment
2. Helicopter Racing League (HRL): a global sports league for competitive helicopter racing. Each year HRL holds the world championship and several regional league competitions where teams compete to earn a spot in the world championship. HRL offers a paid service to stream the races all over the world with live telemetry and predictions throughout each race.
Solution:
- Transcoding
- TV box telemetry: App Engine, Pub/Sub, Dataflow, BigQuery, Cloud Composer, and Cloud Monitoring to increase telemetry and create additional insights.
- Use Cloud CDN for delivering content closer to the users with speed, efficiency, and reliability.
- Use PerfKit Benchmarker to get visibility into metrics like latency, throughput, and jitter.
- AI & ML / Analytics: the BigQuery streaming API and an ML solution can create additional insights for increasing fan engagement.
3. TerramEarth: https://services.google.com/fh/files/blogs/master_case_study_terramearth.pdf
Solution:
- Data replication: stream critical data from vehicles to Cloud Bigtable to drive analytics in real time.
  Sensor device -> HTTPS gateway device -> Pub/Sub -> Dataflow -> Bigtable -> GKE (application & presentation)
- BigQuery partitioning by timestamp for home-base uploads and unified analytics.
- Data processing: use Vertex AI for the ML lifecycle to forecast anticipated stock needs and assist with just-in-time repairs.
- Cloud operations: Network Connectivity Center and Security Command Center for a holistic security view.
- Apigee (X) to manage and monitor APIs; it creates an abstraction layer to connect to different interfaces.
- The Apigee developer portal lets you build a self-service portal for internal and external developers.
- CI/CD: use Cloud Source Repositories, Artifact Registry, and Cloud Build for CI/CD operations.
- Remote workforce
4. Mountkirk Games: https://services.google.com/fh/files/blogs/master_case_study_mountkirk_games.pdf
Solution:
- Cloud Storage for storing game activity logs, which are analyzed using BigQuery.
- BigQuery for storage and analytics; this can also hold the 10 TB of historical data.
- Managed and serverless services for dynamic scaling and minimal cost and operations.
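As a rough illustration of the Mountkirk analytics notes above, loading game activity logs staged in Cloud Storage into BigQuery and querying them might look like this; the bucket, dataset, table, and column names are hypothetical placeholders:

# Hedged sketch (hypothetical names): load newline-delimited JSON activity logs
# from Cloud Storage into BigQuery, then run an analytics query.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # hypothetical project

load_job = client.load_table_from_uri(
    "gs://mountkirk-game-logs/activity/*.json",   # hypothetical bucket/path
    "my-project.analytics.game_activity",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    ),
)
load_job.result()  # wait for the load to finish

# Example analytics query over the loaded (and historical) data.
query = """
    SELECT game_id, COUNT(DISTINCT player_id) AS daily_players
    FROM `my-project.analytics.game_activity`
    WHERE DATE(event_time) = CURRENT_DATE()
    GROUP BY game_id
    ORDER BY daily_players DESC
"""
for row in client.query(query).result():
    print(row.game_id, row.daily_players)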