PCSE Workbook

The document provides guidance on configuring identity and access management in Google Cloud. It discusses migrating users from an on-premises Active Directory to Cloud Identity, creating groups to control access, and securely managing service accounts.

Preparing for Your

Professional Cloud
Security Engineer
Journey

Course Workbook
Certification Exam Guide Sections
1 Configuring access within a cloud solution environment

2 Configuring network security

3 Ensuring data protection

4 Managing operations in a cloud solution environment

5 Ensuring compliance
Section 1:
Configuring access within a
cloud solution environment
1.1 Diagnostic Question 01
Cymbal Bank has acquired a non-banking financial company (NBFC). This NBFC uses Active Directory as their central directory on an on-premises Windows Server. You have been tasked with migrating all the NBFC users and employee information to Cloud Identity.

What should you do?

A. Run Microsoft System Center Configuration Manager (SCCM) on a Compute Engine instance. Leave the channel unencrypted because you are in a secure Google Cloud environment. Deploy Google Cloud Directory Sync on the Compute Engine instance. Connect to the on-premises Windows Server environment from the instance, and migrate users to Cloud Identity.

B. Run Configuration Manager on a Compute Engine instance. Copy the resulting configuration file from this machine onto a new Compute Engine instance to keep the production environment separate from the staging environment. Leave the channel unencrypted because you are in a secure Google Cloud environment. Deploy Google Cloud Directory Sync on this new instance. Connect to the on-premises Windows Server environment from the new instance, and migrate users to Cloud Identity.

C. Use Cloud VPN to connect the on-premises network to your Google Cloud environment. Select an on-premises domain-joined Windows Server. On the domain-joined Windows Server, run Configuration Manager and Google Cloud Directory Sync. Use Cloud VPN's encrypted channel to transfer users from the on-premises Active Directory to Cloud Identity.

D. Select an on-premises domain-joined Windows Server. Run Configuration Manager on the domain-joined Windows Server, and copy the resulting configuration file to a Compute Engine instance. Run Google Cloud Directory Sync on the Compute Engine instance over the internet, and use Cloud VPN to sync users from the on-premises Active Directory to Cloud Identity.
1.1 Diagnostic Question 02
Cymbal Bank has certain default permissions and access for their analyst, finance, and teller teams. These teams are organized into groups that have a set of role-based IAM permissions assigned to them. After a recent acquisition of a small bank, you find that the small bank directly assigns permissions to their employees in IAM. You have been tasked with applying Cymbal Bank's organizational structure to the small bank. Employees will need access to Google Cloud services.

What should you do?

A. Leave all user permissions as-is in the small bank's IAM. Use the Directory API in the Google Workspace Admin SDK to create Google Groups. Use a Python script to allocate users to the Google Groups.

B. Reset all user permissions in the small bank's IAM. Use Cloud Identity to create dynamic groups for each of the bank's teams. Use the dynamic groups' metadata field for team type to allocate users to their appropriate group with a Python script.

C. Reset all user permissions in the small bank's IAM. Use Cloud Identity to create the required Google Groups. Upgrade the Google Groups to Security Groups. Use a Python script to allocate users to the groups.

D. Reset all user permissions in the small bank's IAM. Use the Directory API in the Google Workspace Admin SDK to create Google Groups. Use a Python script to allocate users to the groups.
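As background for the group-creation steps referenced in these options, groups in Cloud Identity can also be created and populated from the command line. The following sketch uses the documented gcloud identity commands; the group email, customer ID, and member address are hypothetical placeholders.

```shell
# Sketch only: "tellers@cymbalbank.example", the customer ID, and the
# member email are hypothetical placeholders.
gcloud identity groups create tellers@cymbalbank.example \
    --organization="C01abc23x" \
    --display-name="Tellers" \
    --description="Role-based access group for the teller team"

# Add a migrated user to the group.
gcloud identity groups memberships add \
    --group-email="tellers@cymbalbank.example" \
    --member-email="user@cymbalbank.example"
```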
Proprietary + Confidential

1.1 Configuring Cloud Identity

Courses
● Security in Google Cloud: M2 Cloud Identity
● Managing Security in Google Cloud: M2 Cloud Identity

Documentation
● Active Directory user account provisioning | Identity and access management | Google Cloud
● What is Configuration Manager? - Google Workspace Admin Help
● Manage membership automatically with dynamic groups - Google Workspace Admin Help
● Creating and updating a dynamic group | Cloud Identity
● Create and manage groups using APIs - Google Workspace Admin Help
1.2 Diagnostic Question 03
Cymbal Bank leverages Google Cloud storage services, an on-premises Apache Spark cluster, and a web application hosted on a third-party cloud. The Spark cluster and web application require limited access to Cloud Storage buckets and a Cloud SQL instance for only a few hours per day. You have been tasked with sharing credentials while minimizing the risk that the credentials will be compromised.

What should you do?

A. Create a service account with appropriate permissions. Authenticate the Spark cluster and the web application as direct requests and share the service account key.

B. Create a service account with appropriate permissions. Have the Spark cluster and the web application authenticate as delegated requests, and share the short-lived service account credential as a JWT.

C. Create a service account with appropriate permissions. Authenticate the Spark cluster and the web application as a delegated request, and share the service account key.

D. Create a service account with appropriate permissions. Have the Spark cluster and the web application authenticate as a direct request, and share the short-lived service account credentials as XML tokens.
1.2 Diagnostic Question 04

Cymbal Bank recently discovered service account key misuse in one of the teams during a security audit. As a precaution, going forward you do not want any team in your organization to generate new external service account keys. You also want to restrict every new service account's usage to its associated project.

What should you do?

A. Navigate to Organization policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Customize the applied to property, and set Enforcement to 'On'. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountUsage.

B. Run the gcloud resource-manager org-policies enable-enforce command with the constraints iam.disableServiceAccountKeyCreation and iam.disableCrossProjectServiceAccountUsage and the Project IDs you want the constraints to apply to.

C. Navigate to Organization policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Under Policy Enforcement, select Merge with parent. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountLienRemoval.

D. Run the gcloud resource-manager org-policies allow command with the boolean constraints iam.disableServiceAccountKeyCreation and iam.disableCrossProjectServiceAccountUsage with the Organization ID.
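For reference, the two boolean constraints named in this question can be enforced from the CLI with the documented gcloud resource-manager org-policies enable-enforce command; in this sketch, ORGANIZATION_ID is a placeholder.

```shell
# Sketch only: ORGANIZATION_ID is a placeholder.
# Block creation of new external service account keys organization-wide.
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountKeyCreation \
    --organization=ORGANIZATION_ID

# Keep each service account usable only within its own project.
gcloud resource-manager org-policies enable-enforce \
    iam.disableCrossProjectServiceAccountUsage \
    --organization=ORGANIZATION_ID
```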
1.2 Managing service accounts

Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM); M5 Securing Compute Engine: Techniques and Best Practices; M8 Securing Kubernetes: Techniques and Best Practices
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)
● Security Best Practices in Google Cloud: M1 Securing Compute Engine: Techniques and Best Practices; M4 Securing Kubernetes: Techniques and Best Practices

Skill Badges
● Ensure Access and Identity in Google Cloud Quest

Documentation
● Creating short-lived service account credentials | Cloud IAM Documentation
● Restricting service account usage | Resource Manager Documentation | Google Cloud
1.3 Diagnostic Question 05
Cymbal Bank publishes its APIs through Apigee. Cymbal Bank has recently acquired ABC Corp, which uses a third-party identity provider. You have been tasked with connecting ABC Corp's identity provider to Apigee for single sign-on (SSO). You need to set up SSO so that Google is the service provider. You also want to monitor and log high-risk activities.

Which two choices would you select to enable SSO?

A. Use openssl to generate public and private keys. Store the public key in an X.509 certificate, and encrypt using RSA or DSA for SAML. Sign in to the Google Admin console, and under Security, upload the certificate.

B. Use openssl to generate a private key. Store the private key in an X.509 certificate, and encrypt using AES or DES for SAML. Sign in to the Google Workspace Admin Console and upload the certificate.

C. Use openssl to generate public and private keys. Store the private key in an X.509 certificate, and encrypt using AES or DES for SAML. Sign in to the Google Admin console, and under Security, upload the certificate.

D. Review Network mapping results, and assign SSO profiles to required users.

E. Review Network mapping results, and assign SAML profiles to required users.
1.3 Diagnostic Question 06

Cymbal Bank's Mobile Development Team has an AI Platform instance in a Google Cloud project. An auditor needs to record the AI Platform jobs and models, along with their usage. You need to assign permissions to the external auditors so that they can view the models and jobs but not retrieve specific details on any of them.

What should you do?

A. Create a custom role for auditors at the Organization level. Create a JSON file with the required permissions ml.models.list and ml.jobs.list. Use gcloud iam roles create role-id --organization organization-id --file=json-file-path.

B. Create a custom role for auditors at the Project level. Create a YAML file with the required permissions ml.models.list and ml.jobs.list. Use gcloud iam roles create role-id --project project-id --file=yaml-file-path.

C. Create a custom role for auditors at the Project level. Use gcloud iam roles create role-name --project project-id --permissions=ml.models.get,ml.jobs.get.

D. Create a custom role for auditors at the Organization level. Create a JSON file with the required permissions ml.models.list and ml.jobs.list. Use gcloud iam role create role-id --organization organization-id --file=json-file-path.
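For reference, the documented pattern for creating a custom role from a role-definition file looks like the following sketch; the role ID, project ID, and file name are placeholders.

```shell
# role-definition.yaml (hypothetical file name) would contain, for example:
#   title: "AI Platform Auditor"
#   description: "List AI Platform models and jobs"
#   stage: "GA"
#   includedPermissions:
#   - ml.models.list
#   - ml.jobs.list
gcloud iam roles create aiplatform_auditor \
    --project=PROJECT_ID \
    --file=role-definition.yaml
```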
1.3 Managing authentication

Courses
● Security in Google Cloud: M2 Cloud Identity; M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M2 Cloud Identity; M3 Identity and Access Management (IAM)

Skill Badges
● Ensure Access and Identity in Google Cloud Quest

Documentation
● SAML overview | Apigee X | Google Cloud
● Set up single sign-on for managed Google Accounts using third-party identity providers - Google Workspace Admin Help
● Assign SSO profile to organizational units or groups - Google Workspace Admin Help
● Network Mapping results - Google Workspace Admin Help
● Creating and managing custom roles | Cloud IAM Documentation
● Understanding IAM custom roles | Cloud IAM Documentation | Google Cloud
● Understanding roles | Cloud IAM Documentation
1.4 Diagnostic Question 07
Cymbal Bank's organizational hierarchy divides the Organization into departments. The Engineering Department has a 'product team' folder. This folder contains folders for each of the bank's products. Each product folder contains one Google Cloud project, but more may be added. Each project contains an App Engine deployment. Cymbal Bank has hired a new technical product manager and a new web developer. The technical product manager must be able to interact with and manage all services in projects that roll up to the Engineering Department folder. The web developer needs read-only access to App Engine configurations and settings for a specific product.

How should you provision the new employees' roles into your hierarchy following principles of least privilege?

A. Assign the Project Editor role in each individual project to the technical product manager. Assign the Project Editor role in each individual project to the web developer.

B. Assign the Project Owner role in each individual project to the technical product manager. Assign the App Engine Deployer role in each individual project to the web developer.

C. Assign the Project Editor role at the Engineering Department folder level to the technical product manager. Assign the App Engine Deployer role at the specific product's folder level to the web developer.

D. Assign the Project Editor role at the Engineering Department folder level to the technical product manager. Create a Custom Role in the product folder that the web developer needs access to. Add the appengine.versions.create and appengine.versions.delete permissions to that role, and assign it to the web developer.
1.4 Diagnostic Question 08
Cymbal Bank's organizational hierarchy divides the Organization into departments. The Engineering Department has a 'product team' folder. This folder contains folders for each of the bank's products. One folder titled "analytics" contains a Google Cloud project that contains an App Engine deployment and a Cloud SQL instance.

A team needs specific access to this project. The team lead needs full administrative access to App Engine and Cloud SQL. A developer must be able to configure and manage all aspects of App Engine deployments. There is also a code reviewer who may periodically review the deployed App Engine source code without making any changes.

What types of permissions would you provide to each of these users?

A. Create custom roles for all three user types at the "analytics" folder level. For the team lead, provide all appengine.* and cloudsql.* permissions. For the developer, provide appengine.applications.* and appengine.instances.* permissions. For the code reviewer, provide the appengine.instances.* permissions.

B. Assign the basic 'App Engine Admin' and 'Cloud SQL Admin' roles to the team lead. Assign the 'App Engine Admin' role to the developer. Assign the 'App Engine Code Viewer' role to the code reviewer. Assign all these permissions at the analytics project level.

C. Create custom roles for all three user types at the project level. For the team lead, provide all appengine.* and cloudsql.* permissions. For the developer, provide appengine.applications.* and appengine.instances.* permissions. For the code reviewer, provide the appengine.instances.* permissions.

D. Assign the basic 'Editor' role to the team lead. Create a custom role for the developer. Provide all appengine.* permissions to the developer. Provide the predefined 'App Engine Code Viewer' role to the code reviewer. Assign all these permissions at the "analytics" folder level.
1.4 Managing and implementing authorization controls

Courses
● Security in Google Cloud: M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M3 Identity and Access Management (IAM)

Skill Badges
● Ensure Access and Identity in Google Cloud Quest

Documentation
● Access control for projects with IAM | Resource Manager Documentation | Google Cloud
● Access control for organizations with IAM | Resource Manager Documentation | Google Cloud
● Access control for folders with IAM | Resource Manager Documentation | Google Cloud
● Understanding roles | Cloud IAM Documentation
1.5 Diagnostic Question 09
Cymbal Bank is divided into separate departments. Each department is divided into teams. Each team works on a distinct product that requires Google Cloud resources for development.

How would you design a Google Cloud organization hierarchy to best match Cymbal Bank's organization structure and needs?

A. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Product folders. Under each Product, create Teams folders. In the Teams folder, add Projects.

B. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Product folders. Add Projects to the Product folders.

C. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create Teams folders. Add Projects to the Teams folders.

D. Create an Organization node. Under the Organization node, create Department folders. Under each Department, create a Teams folder. Under each Team, create Product folders. Add Projects to the Product folders.
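A hierarchy like the ones described in these options can be built with the documented Resource Manager commands; in this sketch, all IDs and display names are placeholders.

```shell
# Sketch only: all IDs and names are placeholders.
# Create a Department folder under the organization ...
gcloud resource-manager folders create \
    --display-name="Engineering" \
    --organization=ORGANIZATION_ID

# ... a Product folder under the Department folder ...
gcloud resource-manager folders create \
    --display-name="Loans Product" \
    --folder=ENGINEERING_FOLDER_ID

# ... and a project inside the Product folder.
gcloud projects create loans-dev-project --folder=LOANS_FOLDER_ID
```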
1.5 Diagnostic Question 10

Cymbal Bank has a team of developers and administrators working on different sets of Google Cloud resources. The Bank's administrators should be able to access the serial ports on Compute Engine instances and create service accounts. Developers should only be able to access serial ports.

How would you design the organization hierarchy to provide the required access?

A. Deny Serial Port Access and Service Account Creation at the Organization level. Create an 'admin' folder and set enforced: false for constraints/compute.disableSerialPortAccess. Create a new 'dev' folder inside the 'admin' folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Add developers to the 'dev' folder, and add administrators to the 'admin' folder.

B. Deny Serial Port Access and Service Account Creation at the Organization level. Create a 'dev' folder and set enforced: false for constraints/compute.disableSerialPortAccess. Create a new 'admin' folder inside the 'dev' folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Add developers to the 'dev' folder, and add administrators to the 'admin' folder.

C. Deny Serial Port Access and Service Account Creation at the Organization level. Create a 'dev' folder and set enforced: true for constraints/compute.disableSerialPortAccess and enforced: true for constraints/iam.disableServiceAccountCreation. Create a new 'admin' folder inside the 'dev' folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Add developers to the 'dev' folder, and add administrators to the 'admin' folder.

D. Allow Serial Port Access and Service Account Creation at the Organization level. Create a 'dev' folder and set enforced: true for constraints/iam.disableServiceAccountCreation. Create another 'admin' folder that inherits from the parent inside the Organization node. Add developers to the 'dev' folder, and add administrators to the 'admin' folder.
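The enforced: true / enforced: false settings referenced in these options correspond to a boolean policy file applied at a folder with the documented set-policy command; in this sketch, the folder ID and file name are placeholders.

```shell
# policy.yaml (hypothetical file name) would contain, for example:
#   constraint: constraints/compute.disableSerialPortAccess
#   booleanPolicy:
#     enforced: false
gcloud resource-manager org-policies set-policy policy.yaml \
    --folder=FOLDER_ID
```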
1.5 Defining resource hierarchy

Courses
● Security in Google Cloud: M2 Cloud Identity; M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M2 Cloud Identity; M3 Identity and Access Management (IAM)

Documentation
● Understanding hierarchy evaluation | Resource Manager Documentation | Google Cloud
● Creating and managing organizations | Resource Manager Documentation | Google Cloud
● Best practices for enterprise organizations | Documentation | Google Cloud
Section 2:
Configuring network security
2.1 Diagnostic Question 01

Cymbal Bank has published an API that internal teams will use through the HTTPS load balancer. You need to limit the API usage to 200 calls every hour. Any exceeding usage should inform the users that servers are busy.

Which gcloud command would you run to throttle the load balancing for the given specification?

A. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges=source-range
--action=throttle
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=3600
--conform-action=allow
--exceed-action=deny-429
--enforce-on-key=HTTP-HEADER

B. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges=source-range
--action=throttle
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=60
--conform-action=deny
--exceed-action=deny-404
--enforce-on-key=HTTP-HEADER

C. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges=source-range
--action=rate-based-ban
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=3600
--conform-action=deny
--exceed-action=deny-403
--enforce-on-key=HTTP-HEADER

D. gcloud compute security-policies rules create priority
--security-policy sec-policy
--src-ip-ranges="<source range>"
--action=rate-based-ban
--rate-limit-threshold-count=200
--rate-limit-threshold-interval-sec=3600
--conform-action=allow
--exceed-action=deny-500
--enforce-on-key=IP
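For comparison with the options above, one documented shape of a Cloud Armor throttle rule is sketched below; the priority, policy name, and source range are placeholders, and the enforcement key shown is only one of the documented choices.

```shell
# Sketch only: values in UPPERCASE are placeholders.
gcloud compute security-policies rules create PRIORITY \
    --security-policy=SEC_POLICY \
    --src-ip-ranges=SOURCE_RANGE \
    --action=throttle \
    --rate-limit-threshold-count=200 \
    --rate-limit-threshold-interval-sec=3600 \
    --conform-action=allow \
    --exceed-action=deny-429 \
    --enforce-on-key=IP
```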
2.1 Diagnostic Question 02
Cymbal Bank is releasing a new loan management application using a Compute Engine managed instance group. External users will connect to the application using a domain name or IP address protected with TLS 1.2. A load balancer already hosts this application and preserves the source IP address. You are tasked with setting up the SSL certificate for this load balancer.

What should you do?

A. Create a Google-managed SSL certificate. Attach a global dynamic external IP address to the internal HTTPS load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an HTTPS proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.

B. Create a Google-managed SSL certificate. Attach a global static external IP address to the external HTTPS load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an HTTPS proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.

C. Import a self-managed SSL certificate. Attach a global static external IP address to the TCP Proxy load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create a TCP proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.

D. Import a self-managed SSL certificate. Attach a global static external IP address to the SSL Proxy load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an SSL proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.
2.1 Diagnostic Question 03

Your organization has a website running on Compute Engine. This instance only has a private IP address. You need to provide SSH access to an on-premises developer who will debug the website from the authorized on-premises location only.

How do you enable this?

A. Set up Cloud VPN. Set up an unencrypted tunnel to one of the hosts in the network. Create outbound or egress firewall rules. Use the private IP address to log in using a gcloud ssh command.

B. Use SOCKS proxy over SSH. Set up an SSH tunnel to one of the hosts in the network. Create the SOCKS proxy on the client side.

C. Use the default VPC's firewall. Open port 22 for TCP protocol using the Google Cloud Console.

D. Use Identity-Aware Proxy (IAP). Set up IAP TCP forwarding by creating ingress firewall rules on port 22 for TCP using the gcloud command.
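The IAP TCP-forwarding approach mentioned in option D relies on allowing ingress from Google's published IAP address range; a sketch with a placeholder network name follows.

```shell
# Sketch only: NETWORK_NAME is a placeholder.
# 35.235.240.0/20 is the documented source range for IAP TCP forwarding.
gcloud compute firewall-rules create allow-ssh-ingress-from-iap \
    --network=NETWORK_NAME \
    --direction=INGRESS \
    --action=allow \
    --rules=tcp:22 \
    --source-ranges=35.235.240.0/20
```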
2.1 Designing network security

Courses
● Networking in Google Cloud: M2 Controlling Access to VPC Networks; M4 Load Balancing
● Security in Google Cloud: M4 Configuring VPC for Isolation and Security; M7 Application Security: Techniques and Best Practices; M9 Protecting Against DDoS Attacks
● Networking in Google Cloud: Defining and Implementing Networks: M2 Controlling Access to VPC Networks; M4 Load Balancing
● Managing Security in Google Cloud: M4 Configuring VPC for Isolation and Security
● Security Best Practices in Google Cloud: M3 Application Security: Techniques and Best Practices
● Mitigating Security Vulnerabilities in Google Cloud: M1 Protecting Against DDoS Attacks

Skill Badges
● Build and Secure Networks in Google Cloud Quest
● Ensure Access and Identity in Google Cloud Quest

Documentation
● gcloud compute security-policies rules update | Cloud SDK Documentation
● gcloud compute security-policies | Cloud SDK Documentation
● Setting up a global external HTTP(S) load balancer (classic) with a Compute Engine backend | Load Balancing | Google Cloud
● Using Google-managed SSL certificates | Load Balancing
● Using IAP for TCP forwarding | Identity-Aware Proxy | Google Cloud
● Securely connecting to VM instances | Compute Engine Documentation | Google Cloud
2.2 Diagnostic Question 04
Cymbal Bank has two engineering teams (T1 and T2) working on two different projects (P1 and P2). Both P1 and P2 use custom VPCs. T2 needs to request and verify DNS records for T1's domain that are internal to P1's Compute Engine instance. After the records are verified, T2 will access and look up more records in this Compute Engine instance.

How would you enable the lookup access to ensure that the requests are always authenticated and are protected against exfiltration?

A. Create a forwarding zone with P1 and P2's VPCs in the VPC network list. Add P2's IP addresses in the private forwarding targets list. Then enable DNSSEC with gcloud dns managed-zones update zone-name --dnssec-state on.

B. Create a peering zone. Set P1 as producer network and P2 as consumer network. Then enable DNSSEC with gcloud dns managed-zones update zone-name --dnssec-state on.

C. Create a managed reverse lookup private zone with P1 and P2's VPCs in the VPC network list. Keep visibility as private. Add the required domain names in dns-names while creating the managed zone.

D. Create a cross-project binding zone by creating a private zone with the URL of P2's VPC network. Then enable DNSSEC with gcloud dns managed-zones update zone-name --dnssec-state on.
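For reference, a DNS peering zone of the kind these options describe is created with the documented gcloud dns flags; in this sketch, the zone name, domain, networks, and project IDs are placeholders.

```shell
# Sketch only: all names and IDs are placeholders.
# Peering zone in the consumer project (P2) targeting P1's network.
gcloud dns managed-zones create peering-zone \
    --description="Peer into P1 for internal records" \
    --dns-name="internal.cymbalbank.example." \
    --visibility=private \
    --networks=p2-vpc \
    --target-project=p1-project-id \
    --target-network=p1-vpc

# Enable DNSSEC on a zone.
gcloud dns managed-zones update zone-name --dnssec-state on
```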
2.2 Diagnostic Question 05
Cymbal Bank needs to connect its employee MongoDB database to a new human resources web application on the same network. Both the database and the application are autoscaled with the help of instance templates. As the Security Administrator and Project Editor, you have been tasked with allowing the application to read port 27017 on the database.

What should you do?

A. Create service accounts for the application and database. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--allow TCP:27017
--source-service-accounts web-application-service-account
--target-service-accounts database-service-account

B. Create service accounts for the application and database. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--allow ICMP:27017
--source-service-accounts web-application-service-account
--target-service-accounts database-service-account

C. Create a user account for the database admin and a service account for the application. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--allow TCP:27017
--source-service-accounts web-application-service-account
--target-service-accounts database-admin-user-account

D. Create user accounts for the application and database. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--deny UDP:27017
--source-service-accounts web-application-user-account
--target-service-accounts database-admin-user-account
2.2 Diagnostic Question 06
Cymbal Bank has designed an application to detect credit card fraud that will analyze sensitive information. The application that's running on a Compute Engine instance is hosted in a new subnet on an existing VPC. Multiple teams who have access to other VMs in the same VPC must access the VM. You want to configure the access so that unauthorized VMs or users from the internet can't access the fraud detection VM.

What should you do?

A. Use subnet isolation. Create a service account for the fraud detection VM. Create one service account for all the teams' Compute Engine instances that will access the fraud detection VM. Create a new firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
--source-service-accounts <one service account for all teams>
--target-service-accounts <fraud detection engine's service account>

B. Use target filtering. Create two tags called 'app' and 'data'. Assign the 'app' tag to the Compute Engine instance hosting the Fraud Detection App (source), and assign the 'data' tag to the other Compute Engine instances (target). Create a firewall rule to allow all ingress communication on this tag.

C. Use subnet isolation. Create a service account for the fraud detection engine. Create service accounts for each of the teams' Compute Engine instances that will access the engine. Add a firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
--source-service-accounts <list of service accounts>
--target-service-accounts <fraud detection engine's service account>

D. Use target filtering. Create a tag called 'app', and assign the tag to both the source and the target. Create a firewall rule to allow all ingress communication on this tag.
2.2 Configuring network segmentation

Courses
● Networking in Google Cloud: M2 Controlling Access to VPC Networks
● Security in Google Cloud: M4 Configuring VPC for Isolation and Security
● Networking in Google Cloud: Defining and Implementing Networks: M2 Controlling Access to VPC Networks
● Managing Security in Google Cloud: M4 Configuring VPC for Isolation and Security

Skill Badges
● Build and Secure Networks in Google Cloud Quest

Documentation
● DNS zones overview | Google Cloud
● Using firewall rules | VPC | Google Cloud
● Best practices and reference architectures for VPC design
● Best practices for securing service accounts | Cloud IAM Documentation
2.3 Diagnostic Question 07

The data from Cymbal Bank’s loan applicants resides in a Shared VPC. A credit analysis team uses a CRM tool hosted in the App Engine standard environment. You need to provide credit analysts with access to this data. You want the charges to be incurred by the credit analysis team.

What should you do?

A. Add egress firewall rules to allow TCP and UDP ports for the App Engine standard environment in the Shared VPC network. Create either a client-side connector in the Service Project or a server-side connector in the Host Project using the IP range or Project ID of the target VPC. Verify that the connector is in a READY state. Create an egress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.

B. Add egress firewall rules to allow SSH and/or RDP ports for the App Engine standard environment in the Shared VPC network. Create a client-side connector in the Service Project using the IP range of the target VPC. Verify that the connector is in a READY state. Create an egress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.

C. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a client-side connector in the Service Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.

D. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a server-side connector in the Host Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.
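As a reference, the client-side Serverless VPC Access connector described in these options could be created with a single gcloud invocation, sketched below in Python. All resource names are placeholders; the subnet belongs to the Shared VPC host project, while the connector (and its charges) belong to the credit-analysis service project.

```python
# Assemble the gcloud command for a Serverless VPC Access connector in a
# Shared VPC setup. Hypothetical names: crm-connector, shared-subnet,
# host-project, svc-project.
def connector_create_cmd(connector, region, subnet, host_project, service_project):
    return [
        "gcloud", "compute", "networks", "vpc-access", "connectors", "create",
        connector,
        f"--region={region}",
        f"--subnet={subnet}",                # subnet lives in the host project
        f"--subnet-project={host_project}",
        f"--project={service_project}",      # billed to the service project
    ]

cmd = connector_create_cmd("crm-connector", "us-central1",
                           "shared-subnet", "host-project", "svc-project")
print(" ".join(cmd))
```

Creating the connector in the service project is what makes the charges accrue to the credit analysis team.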
2.3 Diagnostic Question 08

Cymbal Bank’s Customer Details API runs on a Compute Engine instance with only an internal IP address. Cymbal Bank’s new branch is co-located outside the Google Cloud points-of-presence (PoPs) and requires a low-latency way for its on-premises apps to consume the API without exposing the requests to the public internet.

Which solution would you recommend?

A. Use a Content Delivery Network (CDN). Establish direct peering with one of Google’s nearby edge-enabled PoPs.

B. Use Carrier Peering. Use a service provider to access their enterprise-grade infrastructure to connect to the Google Cloud environment.

C. Use Partner Interconnect. Use a service provider to access their enterprise-grade infrastructure to connect to the Google Cloud environment.

D. Use Dedicated Interconnect. Establish direct peering with one of Google’s nearby edge-enabled PoPs.
2.3 Diagnostic Question 09

An external audit agency needs to perform a one-time review of Cymbal Bank’s Google Cloud usage. The auditors should be able to access a Default VPC containing BigQuery, Cloud Storage, and Compute Engine instances where all the usage information is stored. You have been tasked with enabling the access from their on-premises environment, which already has a configured VPN.

What should you do?

A. Use a Cloud VPN tunnel. Use your DNS provider to create DNS zones and records for private.googleapis.com. Connect the DNS provider to your on-premises network. Broadcast the request from the on-premises environment. Use a software-defined firewall to manage incoming and outgoing requests.

B. Use Partner Interconnect. Configure an encrypted tunnel in the auditor's on-premises environment. Use Cloud DNS to create DNS zones and A records for private.googleapis.com.

C. Use a Cloud VPN tunnel. Use Cloud DNS to create DNS zones and records for *.googleapis.com. Set up on-premises routing with Cloud Router. Use Cloud Router custom route advertisements to announce routes for Google Cloud destinations.

D. Use Direct Interconnect. Configure a VLAN in the auditor's on-premises environment. Use Cloud DNS to create DNS zones and records for restricted.googleapis.com and private.googleapis.com. Set up on-premises routing with Cloud Router. Add custom static routes in the VPC to connect individually to BigQuery, Cloud Storage, and Compute Engine instances.
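The Cloud DNS records behind this kind of private Google API access can be sketched as plain record sets. Per the Private Google Access for on-premises hosts documentation, the private.googleapis.com VIP range is 199.36.153.8/30, and *.googleapis.com is aliased onto it with a CNAME; TTLs here are illustrative.

```python
# Record sets for a Cloud DNS private zone covering googleapis.com:
# an A record pinning private.googleapis.com to its documented VIP range,
# and a wildcard CNAME pointing every Google API hostname at it.
private_vip = [f"199.36.153.{i}" for i in range(8, 12)]  # 199.36.153.8/30

record_sets = [
    {"name": "private.googleapis.com.", "type": "A",
     "ttl": 300, "rrdatas": private_vip},
    {"name": "*.googleapis.com.", "type": "CNAME",
     "ttl": 300, "rrdatas": ["private.googleapis.com."]},
]
```

On-premises hosts then reach these VIPs over the VPN tunnel, with Cloud Router advertising the route.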
2.3 Diagnostic Question 10

An external audit agency needs to perform a one-time review of Cymbal Bank’s Google Cloud usage. The auditors should be able to access a Default VPC containing BigQuery, Cloud Storage, and Compute Engine instances where all the usage information is stored. You have been tasked with enabling the access from their on-premises environment, which already has a configured VPN.

What should you do?

A. Cloud DNS, subnet primary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

B. Cloud VPN, subnet secondary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

C. Nginx load balancer, subnet secondary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

D. Cloud NAT gateway, subnet primary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster
Proprietary + Confidential

2.3 Establish private connectivity

Courses:
Networking in Google Cloud
● M5 Hybrid Connectivity
● M7 Network Design and Deployment
Security in Google Cloud
● M4 Configuring VPC for Isolation and Security
● M5 Securing Compute Engine: Techniques and Best Practices
Networking in Google Cloud: Hybrid Connectivity and Network Management
● M1 Hybrid Connectivity
● M3 Network Design and Deployment
Managing Security in Google Cloud
● M4 Configuring VPC for Isolation and Security
Security Best Practices in Google Cloud
● M1 Securing Compute Engine: Techniques and Best Practices

Skill Badges:
● Build and Secure Networks in Google Cloud Quest
● Ensure Access and Identity in Google Cloud Quest

Documentation:
● Configuring Serverless VPC Access | Google Cloud
● Overview of VPC Service Controls | Google Cloud
● Choosing a Network Connectivity product | Google Cloud
● Private Google Access | VPC
● Manage zones | Cloud DNS
● Private Google Access for on-premises hosts | VPC
● Simplifying cloud networking for enterprises: announcing Cloud NAT and more | Google Cloud Blog
● Example GKE setup | Cloud NAT
● Cloud NAT overview
Section 3:
Ensuring data protection
3.1 Diagnostic Question 01

Cymbal Bank has hired a data analyst team to analyze scanned copies of loan applications. Because this is an external team, Cymbal Bank does not want to share the name, gender, phone number, or credit card numbers listed in the scanned copies. You have been tasked with hiding this PII information while minimizing latency.

What should you do?

A. Use the Cloud Data Loss Prevention (DLP) API to make redact image requests. Provide your project ID, built-in infoTypes, and the scanned copies when you make the requests.

B. Use the Cloud Vision API to perform optical character recognition (OCR) on scanned images. Redact the text using the Cloud Natural Language API with regular expressions.

C. Use the Cloud Vision API to perform optical character recognition (OCR) on scanned images. Redact the text using the Cloud Data Loss Prevention (DLP) API with regular expressions.

D. Use the Cloud Vision API to perform text extraction from scanned images. Redact the text using the Cloud Natural Language API with regular expressions.
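For orientation, a DLP redact-image request can be sketched as a plain request body in the shape the projects.image.redact REST method accepts, to the best of my reading of the DLP documentation. The image bytes and field values below are placeholders.

```python
import base64

# Sketch of a DLP image redaction request body using built-in infoTypes.
info_types = ["PERSON_NAME", "GENDER", "PHONE_NUMBER", "CREDIT_CARD_NUMBER"]
scanned_copy = b"..."  # stand-in for one scanned loan application (PNG bytes)

redact_request = {
    "inspectConfig": {"infoTypes": [{"name": n} for n in info_types]},
    # One redaction entry per infoType; matched findings are blacked out.
    "imageRedactionConfigs": [{"infoType": {"name": n}} for n in info_types],
    "byteItem": {
        "type": "IMAGE_PNG",
        "data": base64.b64encode(scanned_copy).decode("ascii"),
    },
}
```

A single API call both detects and redacts, which is why this path minimizes latency compared with chaining OCR and a separate redaction step.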
3.1 Diagnostic Question 02

Cymbal Bank needs to statistically predict the days customers delay the payments for loan repayments and credit card repayments. Cymbal Bank does not want to share the exact dates a customer has defaulted or made a payment with data analysts. Additionally, you need to hide the customer name and the customer type, which could be corporate or retail.

How do you provide the appropriate information to the data analysts?

A. Generalize all dates to year and month with bucketing. Use the built-in infoType for customer name. Use a custom infoType for customer type with a custom dictionary.

B. Generalize all dates to year and month with bucketing. Use the built-in infoType for customer name. Use a custom infoType for customer type with a regular expression.

C. Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with a custom dictionary.

D. Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with a regular expression.
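The inspect configuration these options describe can be sketched in the DLP REST shape: a built-in infoType for the customer name plus a custom dictionary infoType for the two known customer types. Field names follow my reading of the DLP docs; date generalization (year/month bucketing) would be configured separately as a record transformation.

```python
# DLP inspect configuration: built-in detector for names, plus a custom
# infoType backed by a word-list dictionary for the closed set of
# customer types ("corporate" / "retail").
inspect_config = {
    "infoTypes": [{"name": "PERSON_NAME"}],  # built-in detector
    "customInfoTypes": [{
        "infoType": {"name": "CUSTOMER_TYPE"},
        "dictionary": {"wordList": {"words": ["corporate", "retail"]}},
    }],
}
```

A dictionary fits here because the customer type is a small closed vocabulary, whereas a regular expression is better suited to patterned values such as account numbers.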
3.1 Diagnostic Question 03

Cymbal Bank stores customer information in a BigQuery table called ‘Information,’ which belongs to the dataset ‘Customers.’ Various departments of Cymbal Bank, including loan, credit card, and trading, access the Information table. Although the data source remains the same, each department needs to read and analyze separate customers and customer-attributes. You want a cost-effective way to configure departmental access to BigQuery to provide optimal performance.

What should you do?

A. Create separate datasets for each department. Create views for each dataset separately. Authorize these views to access the source dataset. Share the datasets with departments. Provide the bigquery.dataViewer role to each department’s required users.

B. Create an authorized dataset in BigQuery’s Explorer panel. Write the Customers table’s metadata into a JSON file, and edit the file to add each department’s Project ID and Dataset ID. Provide the bigquery.user role to each department’s required users.

C. Secure data with classification. Open the Data Catalog Taxonomies page in the Google Cloud Console. Create policy tags for required columns and rows. Provide the bigquery.user role to each department’s required users. Provide policy tags access to each department separately.

D. Create separate datasets for each department. Create authorized functions in each dataset to perform required aggregations. Write transformed data to new tables for each department separately. Provide the bigquery.dataViewer role to each department’s required users.
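The authorized-view wiring in option A can be sketched for one department; all project, dataset, table, and column names here are hypothetical. The view lives in the department's dataset, and the entry below is appended to the source dataset's access list so the view itself (not the analysts) is authorized to read Customers.Information.

```python
# A department-scoped view over the shared source table (illustrative SQL).
view_sql = """
SELECT customer_id, loan_attributes
FROM `cymbal-project.Customers.Information`
WHERE department = 'loan'
"""

# Access entry added to the *source* dataset's ("Customers") metadata,
# authorizing the view to query it on behalf of its readers.
authorized_view_entry = {
    "view": {
        "projectId": "cymbal-project",
        "datasetId": "loan_dept",
        "tableId": "loan_customers_view",
    }
}
```

Analysts then need only bigquery.dataViewer on their department's dataset, never on the source dataset itself.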
3.1 Diagnostic Question 04

Cymbal Bank has two vendors who need to collaborate on the same files and images, and each vendor is represented by Google Groups. Each vendor will perform different sets of transformations on these files. Cymbal Bank has provided a perimeter network with lower trust where Projects for the two vendors are also hosted along with a Project ‘ForVendors,’ which contains the files and images in Cloud Storage.

How would you configure access in the vendor Projects so that vendors can’t communicate with each other, but can still copy the data from the bank’s Cloud Storage bucket?

A. Use VPC Service Controls with Service perimeter bridges. Use the gcloud access-context-manager perimeters command and use the project IDs of the two vendors with --resources while the ‘ForVendors’ project is selected. Use Identity and Access Management (IAM) to provide appropriate permissions.

B. Use VPC Service Controls with Context-aware access with ingress rules. Use the command gcloud access-context-manager perimeters update and set ingress rules for the vendor projects. Use IAM to provide appropriate permissions.

C. Use VPC Service Controls with Service perimeter bridges. Use the command gcloud access-context-manager perimeters and use the project IDs of each vendor and the bank project with --resources separately. Use IAM to provide appropriate permissions.

D. Use VPC Service Controls with Context-aware access with ingress rules. Use the command gcloud access-context-manager perimeters update and set ingress rules for the bank’s bucket in the vendor’s Cloud Storage buckets separately. Use IAM to provide appropriate permissions.
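The per-vendor bridge pattern from option C can be sketched as two gcloud invocations: one bridge pairs each vendor project with the bank's ‘ForVendors’ project, and the two vendors never share a bridge, so they stay isolated from each other. Policy ID and project numbers below are placeholders.

```python
# Assemble the gcloud command for a bridge-type service perimeter joining
# one vendor project with the bank's shared project.
def bridge_cmd(name, policy, vendor_project_num, bank_project_num):
    return [
        "gcloud", "access-context-manager", "perimeters", "create", name,
        f"--policy={policy}",
        f"--title={name}",
        "--perimeter-type=bridge",
        f"--resources=projects/{vendor_project_num},projects/{bank_project_num}",
    ]

cmd_a = bridge_cmd("vendor-a-bridge", "1234567890", "111111111111", "999999999999")
cmd_b = bridge_cmd("vendor-b-bridge", "1234567890", "222222222222", "999999999999")
```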
3.1 Diagnostic Question 05

Cymbal Bank has a Cloud SQL instance that must be shared with an external agency. The agency’s developers will be assigned roles and permissions through a Google Group in Identity and Access Management (IAM). The external agency is on an annual contract and will require a connection string, username, and password to connect to the database.

How would you configure the group’s access?

A. Use Secret Manager. Use the duration attribute to set the expiry period to one year. Add the secretmanager.secretAccessor role for the group that contains the external developers.

B. Use Cloud Key Management Service. Use the destination IP address and Port attributes to provide access for developers at the external agency. Remove the IAM access after one year and rotate the shared keys. Add the cloudkms.cryptoKeyEncrypterDecrypter role for the group that contains the external developers.

C. Use Secret Manager. Use the resource attribute to set a key-value pair with key as duration and value as the expiry period one year from now. Add the secretmanager.viewer role for the group that contains the external developers.

D. Use Secret Manager for the connection string and username, and use Cloud Key Management Service for the password. Use tags to set the expiry period to the timestamp one year from now. Add the secretmanager.secretVersionManager and secretmanager.secretAccessor roles for the group that contains the external developers.
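Option A's pattern can be sketched as two gcloud calls: create a secret that expires after one year, then grant secretAccessor to the external developers' group. Secret and group names are placeholders.

```python
# gcloud invocations for a time-boxed secret shared with an external group.
secret_cmds = [
    ["gcloud", "secrets", "create", "cloudsql-credentials",
     "--replication-policy=automatic",
     "--ttl=31536000s"],  # secret expires one year after creation
    ["gcloud", "secrets", "add-iam-policy-binding", "cloudsql-credentials",
     "--member=group:agency-devs@example.com",
     "--role=roles/secretmanager.secretAccessor"],
]
```

When the contract lapses, the expiry removes the secret without anyone having to remember to revoke it.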
3.1 Diagnostic Question 06

Cymbal Bank wants to deploy an n-tier web application. The frontend must be supported by an App Engine deployment, an API with a Compute Engine instance, and Cloud SQL for a MySQL database. This application is only supported during working hours; outside those hours, App Engine is disabled and the Compute Engine instance is stopped.

How would you enable the infrastructure to access the database?

A. Use VM metadata to read the current machine’s IP address, and use a gcloud command to add access to Cloud SQL. Store Cloud SQL’s connection string and password in Cloud Key Management Service. Store the username in Project metadata.

B. Use Project metadata to read the current machine’s IP address, and use a startup script to add access to Cloud SQL. Store Cloud SQL’s connection string in Cloud Key Management Service, and store the password in Secret Manager. Store the username in Project metadata.

C. Use Project metadata to read the current machine’s IP address, and use a gcloud command to add access to Cloud SQL. Store Cloud SQL’s connection string and username in Cloud Key Management Service, and store the password in Secret Manager.

D. Use VM metadata to read the current machine’s IP address, and use a startup script to add access to Cloud SQL. Store Cloud SQL’s connection string, username, and password in Secret Manager.
3.1 Protecting sensitive data

Courses:
Security in Google Cloud
● M4 Configuring Virtual Private Cloud for Isolation and Security
● M5 Securing Compute Engine: Techniques and Best Practices
● M6 Securing Cloud Data: Techniques and Best Practices
● M7 Application Security: Techniques and Best Practices
● M10 Content-Related Vulnerabilities: Techniques and Best Practices
Managing Security in Google Cloud
● M4 Configuring Virtual Private Cloud for Isolation and Security
Security Best Practices in Google Cloud
● M1 Securing Compute Engine: Techniques and Best Practices
● M2 Securing Cloud Data: Techniques and Best Practices
● M3 Application Security: Techniques and Best Practices
Mitigating Security Vulnerabilities in Google Cloud
● M3 Monitoring, Logging, Auditing, and Scanning

Documentation:
● Image inspection and redaction | Data Loss Prevention Documentation | Google Cloud
● Redacting sensitive data from images | Data Loss Prevention Documentation | Google Cloud
● InfoType detector reference | Data Loss Prevention Documentation | Google Cloud
● Pseudonymization | Data Loss Prevention Documentation | Google Cloud
● Authorized views | BigQuery | Google Cloud
● Authorized datasets | BigQuery | Google Cloud
● Sharing across perimeters with bridges | VPC Service Controls | Google Cloud
● Creating a perimeter bridge | VPC Service Controls | Google Cloud
● Context-aware access with ingress rules | VPC Service Controls | Google Cloud
● Frequently asked questions | Cloud IAM Documentation
● Access control with IAM | Secret Manager Documentation | Google Cloud
● About VM metadata | Compute Engine Documentation | Google Cloud
3.2 Diagnostic Question 07

Cymbal Bank calculates employee incentives on a monthly basis for the sales department and on a quarterly basis for the marketing department. The incentives are released with the next month’s salary. Employees’ sales performance documents are stored as spreadsheets, which are retained for at least one year for audit. You want to configure the most cost-effective storage for this scenario.

What should you do?

A. Import the spreadsheets to BigQuery, and create separate tables for Sales and Marketing. Set table expiry rules to 365 days for both tables. Create jobs scheduled to run every quarter for Marketing and every month for Sales.

B. Upload the spreadsheets to Cloud Storage. Select the Nearline storage class for the sales department and Coldline storage for the marketing department. Use object lifecycle management rules to set the storage class to Archival after 365 days. Process the data in BigQuery using jobs that run monthly for Sales and quarterly for Marketing.

C. Import the spreadsheets to Cloud SQL, and create separate tables for Sales and Marketing. For Table Expiration, set 365 days for both tables. Use stored procedures to calculate incentives. Use App Engine cron jobs to run the stored procedures monthly for Sales and quarterly for Marketing.

D. Import the spreadsheets into Cloud Storage and create NoSQL tables. Use App Engine cron jobs to run monthly for Sales and quarterly for Marketing. Use a separate job to delete the data after 1 year.
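The lifecycle step in option B maps directly onto the JSON lifecycle configuration Cloud Storage accepts; a minimal sketch:

```python
# Cloud Storage lifecycle configuration: move objects to the Archive storage
# class once they are 365 days old (retained for audit, at the lowest cost).
lifecycle_config = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
            "condition": {"age": 365},
        }
    ]
}
```

The same configuration could be applied with `gsutil lifecycle set` or via the JSON API's bucket `lifecycle` field.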
3.2 Diagnostic Question 08

Cymbal Bank uses Google Kubernetes Engine (GKE) to deploy its Docker containers. You want to encrypt the boot disk for a cluster running a custom image so that the key rotation is controlled by the bank. GKE clusters will also generate up to 1024 randomized characters that will be used with the keys with Docker containers.

What steps would you take to apply the encryption settings with a dedicated hardware security layer?

A. In the Google Cloud console, navigate to Google Kubernetes Engine. Select your cluster and the boot node inside the cluster. Enable customer-managed encryption. Use Cloud HSM to generate random bytes and provide an additional layer of security.

B. Create a new GKE cluster with customer-managed encryption and HSM enabled. Deploy the containers to this cluster. Delete the old GKE cluster. Use Cloud HSM to generate random bytes and provide an additional layer of security.

C. Create a new key ring using Cloud Key Management Service. Extract this key to a certificate. Use the kubectl command to update the Kubernetes configuration. Validate using MAC digital signatures, and use a startup script to generate random bytes.

D. Create a new key ring using Cloud Key Management Service. Extract this key to a certificate. Use the Google Cloud console to update the Kubernetes configuration. Validate using MAC digital signatures, and use a startup script to generate random bytes.
3.2 Diagnostic Question 09

Cymbal Bank has an equated monthly installment (EMI) application. This application must comply with PCI-DSS standards because it stores credit card information. For additional security, you use asymmetric keys to encrypt the data and rotate the keys at fixed intervals. Cymbal Bank has recently migrated to Google Cloud, and you need to set up key rotation.

How would you configure Cloud Key Management Service (KMS)?

A. Use manual key rotation and assign yourself the cloudkms.cryptoKeyEncrypterDecrypter role.

B. Use automatic key rotation and assign yourself the cloudkms.cryptoKeyEncrypterDecrypter role.

C. Use automatic key rotation and assign yourself the cloudkms.admin role.

D. Use manual key rotation and assign yourself the cloudkms.admin role.
3.2 Diagnostic Question 10

Cymbal Bank needs to migrate existing loan processing applications to Google Cloud. These applications transform confidential financial information. All the data should be encrypted at all stages, including sharing between sockets and RAM. An integrity test should also be performed every time these instances boot. You need to use Cymbal Bank’s encryption keys to configure the Compute Engine instances.

What should you do?

A. Create a Confidential VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent.

B. Create a Shielded VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for earlyBootReportEvent.

C. Create a Confidential VM instance with Customer-Managed Encryption Keys. In Cloud Logging, collect all logs for earlyBootReportEvent.

D. Create a Shielded VM instance with Customer-Managed Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent.
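The Confidential VM launch attestation event named in options A and D can be isolated with a Cloud Logging filter along these lines. This is a sketch; the exact payload field should be confirmed against the Confidential VM logging documentation.

```python
# Cloud Logging filter selecting Compute Engine entries that carry a
# sevLaunchAttestationReportEvent payload (emitted on Confidential VM boot).
log_filter = " AND ".join([
    'resource.type="gce_instance"',
    'jsonPayload.sevLaunchAttestationReportEvent:*',  # field-presence match
])
```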
3.2 Managing encryption at rest

Courses:
Security in Google Cloud
● M5 Securing Compute Engine: Techniques and Best Practices
● M6 Securing Cloud Data: Techniques and Best Practices
Security Best Practices in Google Cloud
● M1 Securing Compute Engine: Techniques and Best Practices
● M2 Securing Cloud Data: Techniques and Best Practices

Skill Badges:
● Ensure Access and Identity in Google Cloud Quest

Documentation:
● Storage classes | Google Cloud
● Object Lifecycle Management | Cloud Storage | Google Cloud
● Use customer-managed encryption keys (CMEK) | Kubernetes Engine Documentation | Google Cloud
● Configuring a custom boot disk | Kubernetes Engine Documentation | Google Cloud
● Using Cloud KMS with other products
● Rotating keys | Cloud KMS Documentation
● Confidential VM and Compute Engine | Google Cloud
Section 4:
Managing operations in a
cloud solution environment
4.1 Diagnostic Question 01

Cymbal Bank has received Docker source files from its third-party developers in an Artifact Registry repository. These Docker files will be part of a CI/CD pipeline to update Cymbal Bank’s personal loan offering. The bank wants to prevent the possibility of remote users arbitrarily using the Docker files to run any code. You have been tasked with using Container Analysis’ On-Demand scanning to scan the images for a one-time update.

What should you do?

A. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, scan, severity check, and push—specifying the location of the Artifact Registry repository. Specify the severity level as CRITICAL. Start the build with the command gcloud builds submit.

B. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, build, severity check, and push—specifying the location of the Artifact Registry repository. Specify the severity level as HIGH. Start the build with the command gcloud builds submit.

C. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, severity check, build, and push—specifying the location of the Artifact Registry repository. Specify the severity level as HIGH. Start the build with the command gcloud builds submit.

D. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, severity check, scan, and push—specifying the location of the Artifact Registry repository. Specify the severity level as CRITICAL. Start the build with the command gcloud builds submit.
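The build-scan-check-push ordering can be sketched as the structure a cloudbuild.yaml would parse into, loosely following the documented On-Demand Scanning pipeline pattern. The image path and exact step arguments are illustrative.

```python
# cloudbuild.yaml structure, expressed as a dict: build the image, run an
# on-demand scan, gate on severity, and only then push to Artifact Registry.
image = "us-central1-docker.pkg.dev/cymbal-project/loans/loan-app"

cloudbuild = {"steps": [
    {"id": "build", "name": "gcr.io/cloud-builders/docker",
     "args": ["build", "-t", image, "."]},
    {"id": "scan", "name": "gcr.io/google.com/cloudsdktool/cloud-sdk",
     "entrypoint": "bash",
     "args": ["-c",
              f"gcloud artifacts docker images scan {image} "
              "--format='value(response.scan)' > /workspace/scan_id.txt"]},
    {"id": "severity-check", "name": "gcr.io/google.com/cloudsdktool/cloud-sdk",
     "entrypoint": "bash",
     "args": ["-c",
              "gcloud artifacts docker images list-vulnerabilities "
              "$(cat /workspace/scan_id.txt) "
              "--format='value(vulnerability.effectiveSeverity)' | "
              "grep -q CRITICAL && exit 1 || exit 0"]},
    {"id": "push", "name": "gcr.io/cloud-builders/docker",
     "args": ["push", image]},
]}
```

Because the severity gate runs before the push step, an image with CRITICAL findings fails the build and never reaches the registry.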
4.1 Diagnostic Question 02

Cymbal Bank’s management is concerned about virtual machines being compromised by bad actors. More specifically, they want to receive immediate alerts if there have been changes to the boot sequence of any of their Compute Engine instances.

What should you do?

A. Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Secure Boot enabled with Unified Extensible Firmware Interface (UEFI). Validate integrity events in Cloud Monitoring and place alerts on launch attestation events.

B. Set Cloud Logging measurement policies on the VMs. Use Cloud Logging to place alerts whenever actualMeasurements and policyMeasurements don’t match.

C. Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events.

D. Set project-level policies that require all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events.
4.1 Diagnostic Question 03

Cymbal Bank runs a Node.js application on a Compute Engine instance. Cymbal Bank needs to share this base image with a ‘development’ Google Group. This base image should support secure boot for the Compute Engine instances deployed from this image.

How would you automate the image creation?

A. Prepare a shell script. Add the command gcloud compute instances stop with the Node.js instance name. Set up certificates for secure boot. Add gcloud compute images create, and specify the Compute Engine instance’s persistent disk and zone and the certificate files. Add gcloud compute images add-iam-policy-binding and specify the ‘development’ group.

B. Start the Compute Engine instance. Set up certificates for secure boot. Prepare a cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine instance and the ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration file path and the certificates.

C. Prepare a shell script. Add the command gcloud compute instances start to the script to start the Node.js Compute Engine instance. Set up Measured Boot for secure boot. Add gcloud compute images create, and specify the persistent disk and zone of the Compute Engine instance.

D. Stop the Compute Engine instance. Set up Measured Boot for secure boot. Prepare a cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine instance and the ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration file path.
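The shell script from option A boils down to three gcloud invocations, sketched below. Instance, image, zone, certificate file, and group names are placeholders; the certificate flags follow the custom Shielded VM image flow.

```python
# Stop the source instance, create a secure-boot-capable image from its disk
# (attaching the PK/KEK/db certificates), then share it with the group.
image_cmds = [
    ["gcloud", "compute", "instances", "stop", "nodejs-app",
     "--zone=us-central1-a"],
    ["gcloud", "compute", "images", "create", "nodejs-base-image",
     "--source-disk=nodejs-app",
     "--source-disk-zone=us-central1-a",
     "--platform-key-file=pk.der",            # secure boot certificates
     "--key-exchange-key-file=kek.der",
     "--signature-database-file=db.der",
     "--guest-os-features=UEFI_COMPATIBLE"],
    ["gcloud", "compute", "images", "add-iam-policy-binding", "nodejs-base-image",
     "--member=group:development@cymbalbank.com",
     "--role=roles/compute.imageUser"],
]
```

Stopping the instance first ensures the disk is in a consistent state before imaging.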
4.1 Diagnostic Question 04

Cymbal Bank uses Docker containers to interact with APIs for its personal banking application. These APIs are under PCI-DSS compliance. The Kubernetes environment running the containers will not have internet access to download required packages.

How would you automate the pipeline that is building these containers?

A. Create a Dockerfile with a container definition and a cloudbuild.yaml file. Use Cloud Build to build the image from the Dockerfile. Upload the built image to a Google Container Registry and the Dockerfile to a Git repository. In the cloudbuild.yaml template, include attributes to tag the Git repository path with a Google Kubernetes Engine cluster. Create a trigger in Cloud Build to automate the deployment using the Git repository.

B. Create a Dockerfile with a container definition and a Cloud Build configuration file. Use the Cloud Build configuration file to build and deploy the image from the Dockerfile to a Google Container Registry. In the configuration file, include the Google Container Registry path and the Google Kubernetes Engine cluster. Upload the configuration file to a Git repository. Create a trigger in Cloud Build to automate the deployment using the Git repository.

C. Build a foundation image. Store all artifacts and a Packer definition template in a Git repository. Use Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups to the GKE project.

D. Build an immutable image. Store all artifacts and a Packer definition template in a Git repository. Use Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups to the GKE project.
4.1 Building and deploying secure infrastructure and applications

Courses:
Security in Google Cloud
● M5 Securing Compute Engine: Techniques and Best Practices
● M7 Application Security: Techniques and Best Practices
● M8 Securing Kubernetes: Techniques and Best Practices
● M11 Monitoring, Logging, Auditing, and Scanning
Security Best Practices in Google Cloud
● M1 Securing Compute Engine: Techniques and Best Practices
● M3 Application Security: Techniques and Best Practices
● M4 Securing Kubernetes: Techniques and Best Practices
Mitigating Security Vulnerabilities in Google Cloud
● M3 Monitoring, Logging, Auditing, and Scanning

Skill Badges:
● Secure Workloads in Google Kubernetes Engine Quest

Documentation:
● Using On-Demand Scanning in your Cloud Build pipeline | Container Analysis documentation | Google Cloud
● Container scanning | Container Analysis documentation | Google Cloud
● Creating custom shielded images | Shielded VM | Google Cloud
● Creating, deleting, and deprecating custom images | Compute Engine Documentation | Google Cloud
● Managing access to custom images | Compute Engine Documentation | Google Cloud
● Image management best practices | Compute Engine Documentation | Google Cloud
● Deploying to GKE | Cloud Build Documentation
● Quickstart: Build and push a Docker image with Cloud Build
● Automated image builds with Jenkins, Packer, and Kubernetes | Cloud Architecture Center | Google Cloud
4.2 Diagnostic Question 05

Cymbal Bank has Docker applications deployed in Google Kubernetes Engine. The bank has no offline containers. This GKE cluster is exposed to the public internet and has recently recovered from an attack. Cymbal Bank suspects that someone in the organization changed the firewall rules and has tasked you to analyze and find all details related to the firewall for the cluster. You want the most cost-effective solution for this task.

What should you do?

A. View the GKE logs in Cloud Logging. Use the log scoping tool to filter the Firewall Rules log. Create a Pub/Sub topic. Export the logs to the Pub/Sub topic using the command gcloud logging sinks create. Use Dataflow to read from Pub/Sub and query the stream.

B. View the GKE logs in the local GKE cluster. Use the kubectl Sysdig Capture tool to filter the Firewall Rules log. Create a Pub/Sub topic. Export these logs to the Pub/Sub topic using the GKE cluster. Use Dataflow to read from Pub/Sub and query the stream.

C. View the GKE logs in the local GKE cluster. Use Docker-explorer to explore the Docker file system. Filter and export the Firewall logs to Cloud Logging. Create a dataset in BigQuery to accept the logs. Use the command gcloud logging sinks create to export the logs to the BigQuery dataset. Query this dataset.

D. View the GKE logs in Cloud Logging. Use the log scoping tool to filter the Firewall Rules log. Create a dataset in BigQuery to accept the logs. Export the logs to BigQuery using the command gcloud logging sinks create. Query this dataset.
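A BigQuery export of the firewall-related entries could be set up with a single sink command, sketched below. The project, dataset, and filter details are illustrative.

```python
# gcloud logging sinks create: route firewall audit entries from Cloud
# Logging into a BigQuery dataset for one-off SQL analysis.
log_filter = (
    'resource.type="gce_firewall_rule" '
    'logName:"cloudaudit.googleapis.com"'
)
sink_cmd = [
    "gcloud", "logging", "sinks", "create", "fw-audit-sink",
    "bigquery.googleapis.com/projects/cymbal-project/datasets/fw_logs",
    f"--log-filter={log_filter}",
]
```

Querying the resulting dataset directly in BigQuery avoids the standing cost of a Dataflow streaming pipeline.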
4.2 Diagnostic Question 06

Cymbal Bank experienced a recent security issue. A rogue employee with admin permissions for Compute Engine assigned existing Compute Engine users some arbitrary permissions. You are tasked with finding all these arbitrary permissions.

What should you do to find these permissions most efficiently?

A. Use Event Threat Detection and configure Continuous Exports to filter and write only Firewall logs to the Security Command Center. In the Security Command Center, select Event Threat Detection as the source, filter by evasion: Iam, and sort to find the attack time window. Click on Persistence: IAM Anomalous Grant to display Finding Details. View the Source property of the Finding Details section.

B. Use Event Threat Detection and configure Continuous Exports to filter and write only Firewall logs to the Security Command Center. In the Security Command Center, select Event Threat Detection as the source, filter by category: anomalies, and sort to find the attack time window. Click on Evasion: IAM Anomalous Grant to display Finding Details. View the Source property of the Finding Details section.

C. Use Event Threat Detection and trigger the IAM Anomalous Grant detector. Publish results to the Security Command Center. In the Security Command Center, select Event Threat Detection as the source, filter by category: iam, and sort to find the attack time window. Click on Persistence: IAM Anomalous Grant to display Finding Details. View the Source property of the Finding Details section.

D. Use Event Threat Detection and trigger the IAM Anomalous Grant detector. Publish results to Cloud Logging. In the Security Command Center, select Cloud Logging as the source, filter by category: anomalies, and sort to find the attack time window. Click on Persistence: IAM Anomalous Grant to display Finding Details. View the Source property of the Finding Details section.
4.2 Diagnostic Question 07

Cymbal Bank wants to use Cloud Storage and BigQuery to store safe deposit usage data. Cymbal Bank needs a cost-effective approach to auditing only Cloud Storage and BigQuery data access activities.

How would you use Cloud Audit Logs to enable this analysis?

A. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the service level for BigQuery and Cloud Storage.
B. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the organization level.
C. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for Cloud Storage. All Data Access Logs are enabled for BigQuery by default.
D. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for BigQuery. All Data Access Logs are enabled for Cloud Storage by default.
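Enabling Data Access logs at the service level, as option A describes, maps to the auditConfigs section of a project's IAM policy. A minimal sketch of that stanza, built in Python; you would merge it into the real policy (for example via `gcloud projects set-iam-policy`), which is omitted here.

```python
import json

# Sketch: the auditConfigs stanza that enables Data Access logs only for
# Cloud Storage and BigQuery, rather than org-wide. Field names follow the
# IAM policy schema (auditConfigs / service / auditLogConfigs / logType).
SERVICES = ["storage.googleapis.com", "bigquery.googleapis.com"]
LOG_TYPES = ["ADMIN_READ", "DATA_READ", "DATA_WRITE"]

def audit_configs(services=SERVICES, log_types=LOG_TYPES):
    return {
        "auditConfigs": [
            {
                "service": svc,
                "auditLogConfigs": [{"logType": lt} for lt in log_types],
            }
            for svc in services
        ]
    }

if __name__ == "__main__":
    print(json.dumps(audit_configs(), indent=2))
```

Scoping the stanza to two services keeps log volume, and therefore cost, lower than enabling Data Access logs for every service at the organization level.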
4.2 Diagnostic Question 08

Cymbal Bank has suffered a remote botnet attack on Compute Engine instances in an isolated project. The affected project now requires investigation by an external agency. The external agency requests that you provide all admin and system events to analyze in their local forensics tool. You want to use the most cost-effective solution to enable the external analysis.

What should you do?

A. Use Event Threat Detection. Trigger the IAM Anomalous Grant detector to detect all admins and users with admin or system permissions. Export these logs to the Security Command Center. Give the external agency access to the Security Command Center.
B. Use Cloud Audit Logs. Filter Admin Activity audit logs for only the affected project. Use a Pub/Sub topic to stream the logs from Cloud Audit Logs to the external agency’s forensics tool.
C. Use the Security Command Center. Select Cloud Logging as the source, and filter by category: Admin Activity and category: System Activity. View the Source property of the Finding Details section. Use Pub/Sub topics to export the findings to the external agency’s forensics tool.
D. Use Cloud Monitoring and Cloud Logging. Filter Cloud Monitoring to view only system and admin logs. Expand the system and admin logs in Cloud Logging. Use Pub/Sub to export the findings from Cloud Logging to the external agency’s forensics tool or storage.
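The "filter Admin Activity audit logs and stream them over Pub/Sub" pattern in option B boils down to a Cloud Logging sink. A sketch of the sink body below; the project ID, topic name, and sink name are placeholders, and you would create the sink with `gcloud logging sinks create` or the Logging API.

```python
# Sketch: a Cloud Logging sink definition that streams only Admin Activity
# audit logs from the affected project into a Pub/Sub topic that the
# external agency's forensics tool can subscribe to.

def admin_activity_sink(project_id: str, topic: str) -> dict:
    return {
        "name": "admin-activity-to-forensics",
        "destination": f"pubsub.googleapis.com/projects/{project_id}/topics/{topic}",
        # Admin Activity audit logs live under the cloudaudit.googleapis.com
        # "activity" log; %2F is the URL-escaped slash in the log name.
        "filter": (
            f'logName="projects/{project_id}/logs/'
            f'cloudaudit.googleapis.com%2Factivity"'
        ),
    }

if __name__ == "__main__":
    print(admin_activity_sink("cymbal-isolated", "forensics-export"))
```

Restricting the filter to one log name in one project keeps the export volume, and therefore the Pub/Sub cost, to the minimum the investigation needs.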
4.2 Diagnostic Question 09

The loan application from Cymbal Bank’s lending department collects credit reports that contain credit payment information from customers. According to bank policy, the PDF reports are stored for six months in Cloud Storage, and access logs for the reports are stored for three years. You need to configure a cost-effective storage solution for the access logs.

What should you do?

A. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and Cloud Monitoring. Create table expiry rules to delete logs after three years.
B. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and the Security Command Center. Create table expiry rules to delete logs after three years.
C. Set up a logging export bucket in Cloud Storage to collect data from the Security Command Center. Configure object lifecycle management rules to delete logs after three years.
D. Set up a logging export bucket in Cloud Storage to collect data from Cloud Audit Logs. Configure object lifecycle management rules to delete logs after three years.
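The object lifecycle management rule mentioned in options C and D can be sketched as the JSON below, which deletes log objects once they are roughly three years old (365-day years assumed; adjust if your retention policy counts leap days).

```python
import json

# Sketch: an object lifecycle management rule for the log-export bucket.
# The structure (lifecycle / rule / action / condition) follows the Cloud
# Storage lifecycle configuration format.
THREE_YEARS_DAYS = 3 * 365

def delete_after_three_years() -> dict:
    return {
        "lifecycle": {
            "rule": [
                {
                    "action": {"type": "Delete"},
                    "condition": {"age": THREE_YEARS_DAYS},
                }
            ]
        }
    }

if __name__ == "__main__":
    # This JSON could be saved to a file and applied with
    # `gsutil lifecycle set <file> gs://<log-bucket>`.
    print(json.dumps(delete_after_three_years(), indent=2))
```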


4.2 Diagnostic Question 10

Cymbal Bank uses Compute Engine instances for its APIs, and recently discovered bitcoin mining activities on some instances. The bank wants to detect all future mining attempts and notify the security team. The security team can view the Security Command Center and Cloud Audit Logs.

How should you configure the detection and notification?

A. Use Event Threat Detection’s threat detectors. Export findings from ‘Suspicious account activity’ and ‘Anomalous IAM behavior’ detectors and publish them to a Pub/Sub topic. Create a Cloud Function to send notifications of suspect activities. Use Pub/Sub notifications to invoke the Cloud Function.
B. Enable the VM Manager tools suite in the Security Command Center. Perform a scan of Compute Engine instances. Publish results to Cloud Audit Logging. Create an alert in Cloud Monitoring to send notifications of suspect activities.
C. Enable Anomaly Detection in the Security Command Center. Create and configure a Pub/Sub topic and an email service. Create a Cloud Function to send email notifications for suspect activities. Export findings to a Pub/Sub topic, and use them to invoke the Cloud Function.
D. Enable the Web Security Scanner in the Security Command Center. Perform a scan of Compute Engine instances. Publish results to Cloud Audit Logging. Create an alert in Cloud Monitoring to send notifications for suspect activities.
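The Pub/Sub-to-Cloud Function notification path in options A and C can be sketched as a small function that decodes the exported finding and formats an alert. The payload field names below are assumptions modeled on Security Command Center's notification messages; check them against a real exported finding before relying on them.

```python
import base64
import json

# Sketch: a Pub/Sub-triggered Cloud Function (1st gen background-function
# signature) that decodes an SCC finding notification and formats an alert
# for the security team.

def notify_security_team(event: dict, context=None) -> str:
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    finding = payload.get("finding", {})
    message = (
        f"[SCC ALERT] {finding.get('category', 'UNKNOWN')} "
        f"on {finding.get('resourceName', 'unknown resource')}"
    )
    print(message)  # Replace with your email/chat integration.
    return message

if __name__ == "__main__":
    # Simulate the Pub/Sub envelope locally with an illustrative finding.
    fake = {"finding": {"category": "Execution: Cryptocurrency Mining",
                        "resourceName": "//compute.googleapis.com/projects/p/zones/z/instances/api-1"}}
    event = {"data": base64.b64encode(json.dumps(fake).encode()).decode()}
    notify_security_team(event)
```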
4.2 Configuring logging, monitoring, and detection

Courses

Security in Google Cloud
● M11 Monitoring, Logging, Auditing, and Scanning

Mitigating Security Vulnerabilities in Google Cloud
● M3 Monitoring, Logging, Auditing, and Scanning

Documentation

Security controls and forensic analysis for GKE apps | Cloud Architecture Center
Scenarios for exporting logging data: Security and access analytics | Cloud Architecture Center | Google Cloud
Cloud Audit Logs overview
Cloud Audit Logs with Cloud Storage | Google Cloud
Configure Data Access audit logs
Scenarios for exporting Cloud Logging: Compliance requirements | Cloud Architecture Center | Google Cloud
Security sources for vulnerabilities and threats | Security Command Center | Google Cloud
Configuring Security Command Center
Enabling real-time email and chat notifications
Section 5:
Ensuring compliance
5.1 Diagnostic Question 01

Cymbal Bank’s lending department stores sensitive information, such as your customers’ credit history, address and phone number, in parquet files. You need to upload this personally identifiable information (PII) to Cloud Storage so that it’s secure and compliant with ISO 27018.

How should you protect this sensitive information using Cymbal Bank’s encryption keys and using the least amount of computational resources?

A. Generate an AES-256 key as a 32-byte bytestring. Decode it as a base-64 string. Upload the blob to the bucket using this key.
B. Generate an RSA key as a 32-byte bytestring. Decode it as a base-64 string. Upload the blob to the bucket using this key.
C. Generate a customer-managed encryption key (CMEK) using RSA or AES256 encryption. Decode it as a base-64 string. Upload the blob to the bucket using this key.
D. Generate a customer-managed encryption key (CMEK) using Cloud KMS. Decode it as a base-64 string. Upload the blob to the bucket using this key.
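The customer-supplied key flow described in option A can be sketched in a few lines: generate a random 32-byte AES-256 key and base64-encode it, together with its SHA-256 hash, for the Cloud Storage encryption headers.

```python
import base64
import hashlib
import os

# Sketch: generating a customer-supplied encryption key (CSEK) for Cloud
# Storage. The base64 key goes in the x-goog-encryption-key header and the
# base64 SHA-256 hash in x-goog-encryption-key-sha256 (or the equivalent
# client-library parameters).

def make_csek() -> dict:
    raw_key = os.urandom(32)  # 256 bits of randomness for AES-256
    return {
        "encryption_key": base64.b64encode(raw_key).decode("ascii"),
        "key_sha256": base64.b64encode(hashlib.sha256(raw_key).digest()).decode("ascii"),
    }

if __name__ == "__main__":
    print(make_csek())
```

Because the key never leaves the bank except in the upload request headers, Google stores only the key's hash, which is why CSEK satisfies the "Cymbal Bank's encryption keys" requirement without any Cloud KMS resources.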
5.1 Diagnostic Question 02

You are designing a web application for Cymbal Bank so that customers who have credit card issues can contact dedicated support agents. Customers may enter their complete credit card number when chatting with or emailing support agents. You want to ensure compliance with PCI-DSS and prevent support agents from viewing this information in the most cost-effective way.

What should you do?

A. Use customer-supplied encryption keys (CSEK) and Cloud Key Management Service (KMS) to detect and encrypt sensitive information.
B. Detect sensitive information with the Cloud Natural Language API.
C. Use customer-managed encryption keys (CMEK) and Cloud Key Management Service (KMS) to detect and encrypt sensitive information.
D. Implement Cloud Data Loss Prevention using its REST API.
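The Cloud DLP approach in option D amounts to a content.deidentify request over each chat or email message. A sketch of the request body that masks credit card numbers; the actual API call and project path are omitted, and the transformation choice (replace with the infoType name) is one illustrative option among several.

```python
import json

# Sketch: a Cloud DLP content.deidentify request body that replaces any
# detected CREDIT_CARD_NUMBER with the infoType name before the text is
# shown to a support agent.

def deidentify_request(text: str) -> dict:
    return {
        "item": {"value": text},
        "inspectConfig": {"infoTypes": [{"name": "CREDIT_CARD_NUMBER"}]},
        "deidentifyConfig": {
            "infoTypeTransformations": {
                "transformations": [
                    {
                        "primitiveTransformation": {
                            "replaceWithInfoTypeConfig": {}
                        }
                    }
                ]
            }
        },
    }

if __name__ == "__main__":
    print(json.dumps(deidentify_request("My card is 4111 1111 1111 1111"), indent=2))
```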


5.1 Diagnostic Question 03

Cymbal Bank wants to launch a new website for their customers to enter their personal details and calculate their credit scores. The data will be stored in BigQuery tables and Cloud Storage buckets following GDPR compliance and data expiry rules. Cymbal Bank will also engage external analysts to build customized reports on BigQuery and Cloud Storage buckets. The external analysts must be able to run commands such as gsutil and bq from their command-line interfaces (CLIs), but they should not be able to copy the tables to any public storage.

How should you provide this access without violating GDPR compliance?

A. Create the BigQuery dataset and Cloud Storage bucket in Europe. Change the project that contains BigQuery data to a new VPC with configured access to BigQuery and Cloud Storage. Add the external analysts to another project. Use Shared VPC to share the configured project with the external analysts’ project. Use Identity and Access Management (IAM) to provide the Editor role to the external analysts.
B. Create a multi-region BigQuery dataset and dual-region Cloud Storage for high availability. Implement Identity and Access Management (IAM) controls on a service account with bigquery.rowAccessPolicies.getFilteredData permissions. Configure a Compute Engine instance to use this service account. Provide external analysts with access to this Compute Engine instance.
C. Create a multi-region BigQuery dataset and dual-region Cloud Storage for high availability. Implement Identity and Access Management (IAM) controls on a Compute Engine instance and provide all bigquery.datasets.* permissions. Create a Google group and provide access to the Compute Engine instance. Add all the external analysts to this group.
D. Create the BigQuery dataset and Cloud Storage bucket in Europe. Implement VPC Service Controls. Define the service perimeter to include a Cloud Storage bucket, BigQuery tables, and a Compute Engine instance. Configure the Compute Engine instance to connect to BigQuery and Cloud Storage. Provide external analysts with SSH access to the Compute Engine instance.
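The VPC Service Controls perimeter in option D can be sketched as the resource body below. The access-policy ID, perimeter name, and project number are placeholders; restricting the two services keeps bq and gsutil traffic inside the perimeter so data cannot be copied to resources outside it.

```python
# Sketch: a VPC Service Controls service perimeter definition, shaped like
# the Access Context Manager servicePerimeters resource. POLICY_ID and the
# project number are placeholders.

def service_perimeter(project_number: int) -> dict:
    return {
        "name": "accessPolicies/POLICY_ID/servicePerimeters/gdpr_analytics",
        "title": "gdpr_analytics",
        "status": {
            # Projects whose resources the perimeter protects.
            "resources": [f"projects/{project_number}"],
            # API services that may only be reached from inside the perimeter.
            "restrictedServices": [
                "bigquery.googleapis.com",
                "storage.googleapis.com",
            ],
        },
    }

if __name__ == "__main__":
    print(service_perimeter(123456789012))
```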
5.1 Diagnostic Question 04

Cymbal Bank’s Insurance Analyst needs to collect and store anonymous protected health information of patients from various hospitals. The information is currently stored in Cloud Storage, where each hospital has a folder that contains its own bucket. You have been tasked with collecting and storing the healthcare data from these buckets into Cymbal Bank’s Cloud Storage bucket while maintaining HIPAA compliance.

What should you do?

A. Create a new folder. Create a new Cloud Storage bucket in this folder. Give the Insurance Analyst the ‘Editor’ role on the new folder. Collect all hospital data in this bucket. Use the Google Cloud Healthcare Data Protection Toolkit to monitor this bucket.
B. Create a new project. Create a new Cloud Storage bucket in this project with customer-supplied encryption keys (CSEK). Give the Insurance Analyst the ‘Reader’ role on the project that contains the Cloud Storage bucket. Use the Cloud DLP API to find and mask personally identifiable information (PII) data to comply with HIPAA.
C. Create a new project. Use the Google Cloud Healthcare Data Protection Toolkit to set up a collection bucket, monitoring alerts, audit log sinks, and Forseti monitoring resources. Use Dataflow to read the data from source buckets and write to the new collection buckets. Give the Insurance Analyst the ‘Editor’ role on the collection bucket.
D. Use the Cloud Healthcare API to read the data from the hospital buckets and use de-identification to redact the sensitive information. Use Dataflow to ingest the Cloud Healthcare API feed and write data in a new project that contains the Cloud Storage bucket. Give the Insurance Analyst the ‘Editor’ role on this project.
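The de-identification step in option D corresponds to the Cloud Healthcare API's datasets.deidentify call. A sketch of the request body; the dataset paths and the infoType list are illustrative assumptions, so check them against the current DeidentifyConfig reference before use.

```python
# Sketch: a Cloud Healthcare API datasets.deidentify request body that
# copies a dataset into a de-identified destination, redacting selected
# text infoTypes. Paths and infoTypes are placeholders.

def deidentify_body(project: str, location: str, dest_dataset: str) -> dict:
    return {
        "destinationDataset": (
            f"projects/{project}/locations/{location}/datasets/{dest_dataset}"
        ),
        "config": {
            "text": {
                "transformations": [
                    {
                        "infoTypes": ["PERSON_NAME", "PHONE_NUMBER"],
                        "redactConfig": {},
                    }
                ]
            }
        },
    }

if __name__ == "__main__":
    print(deidentify_body("cymbal-health", "us-central1", "deidentified"))
```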
5.1 Diagnostic Question 05

Cymbal Bank plans to launch a new public website where customers can pay their equated monthly installments (EMI) using credit cards. You need to build a secure payment processing solution using Google Cloud that follows the PCI-DSS isolation requirements.

How would you architect a secure payment processing environment with Google Cloud services to follow PCI-DSS? Select the two correct choices.

A. Create a new Google Cloud account with restricted access (separate from production environment) for the payment processing solution. Create a new Compute Engine instance and configure firewall rules, a VPN tunnel, and an internal load balancer.
B. Create a new Google Cloud account with restricted access (separate from production environment) for the payment processing solution. Configure firewall rules, a VPN tunnel, and an SSL proxy load balancer for a new App Engine flexible environment.
C. Create a new Google Cloud account with restricted access (separate from production environment) for the payment processing solution. Configure firewall rules, a VPN tunnel, and an HTTP(S) load balancer for a new Compute Engine instance.
D. Deploy an Ubuntu Compute Engine instance. Install the libraries needed for payment solutions and encryption/decryption. Deploy using Cloud Deployment Manager.
E. Deploy a Linux base image from preconfigured operating system images. Install only the libraries you need. Deploy using Cloud Deployment Manager.
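Several of the options hinge on firewall rules that isolate the payment environment. A sketch of one such rule, expressed as a Compute Engine API firewall resource body; the network name and target tag are placeholders, and the source ranges shown are Google's published HTTP(S) load balancer and health-check ranges.

```python
# Sketch: a VPC firewall rule restricting inbound traffic to the isolated
# payment-processing instances to HTTPS from the load balancer front ends.
# Network, tag, and project values are illustrative placeholders.

def payment_ingress_rule(project: str) -> dict:
    return {
        "name": "allow-https-to-payment",
        "network": f"projects/{project}/global/networks/payment-vpc",
        "direction": "INGRESS",
        "allowed": [{"IPProtocol": "tcp", "ports": ["443"]}],
        # Google front-end / health-check source ranges for HTTP(S) LBs.
        "sourceRanges": ["130.211.0.0/22", "35.191.0.0/16"],
        # Only instances carrying this tag receive the traffic.
        "targetTags": ["payment-processor"],
    }

if __name__ == "__main__":
    print(payment_ingress_rule("cymbal-payments"))
```

Pairing a deny-by-default network with one narrow allow rule like this is the usual way to satisfy PCI-DSS's requirement to restrict inbound traffic to the cardholder data environment.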
5.1 Ensuring compliance

Documentation

Upload an object by using CSEK | Cloud Storage
Customer-managed encryption keys (CMEK) | Cloud KMS Documentation
Customer-supplied encryption keys | Cloud Storage
Data encryption options | Cloud Storage
ISO/IEC 27018 Certified Compliant | Google Cloud
Automating the Classification of Data Uploaded to Cloud Storage | Cloud Architecture Center | Google Cloud
PCI Data Security Standard compliance | Cloud Architecture Center
Cloud DLP client libraries | Data Loss Prevention Documentation
Data Loss Prevention Demo
Overview of VPC Service Controls | Google Cloud
Getting to know the Google Cloud Healthcare API: Part 1
Sharing and collaboration | Cloud Storage
Google Cloud Platform HIPAA overview guide
Setting up a HIPAA-aligned project | Cloud Architecture Center
Plan time to prepare

● When will you take the exam?
● How many weeks do you have to prepare?
● How many hours will you spend preparing for the exam each week?
● How many total hours will you prepare?

Sample study plan

Week 1: Google Cloud Fundamentals: Core Infrastructure
Week 2: Networking in Google Cloud: Defining and implementing networks
Week 3: Networking in Google Cloud: Hybrid connectivity and network management
Week 4: Build and secure networks in Google Cloud Skill Badge
Week 5: Managing Security in Google Cloud
Week 6: Security Best Practices in Google Cloud
Week 7: Mitigating Security Vulnerabilities on Google Cloud
Week 8: Ensure Access & Identity in Google Cloud Skill Badge
Week 9: Secure Workloads in Google Kubernetes Engine Skill Badge
Week 10: Review documentation
Week 11: Sample questions
Week 12: Take the certification exam
Weekly study plan

Now, consider what you’ve learned about your knowledge and skills
through the diagnostic questions in this course. You should have a
better understanding of what areas you need to focus on and what
resources are available.

Use the template that follows to plan your study goals for each week.
Consider:
● What exam guide section(s) or topic area(s) will you focus on?
● What courses (or specific modules) will help you learn more?
● What Skill Badges or labs will you work on for hands-on practice?
● What documentation links will you review?
● What additional resources will you use, such as sample questions?
You may do some or all of these study activities each week.

Duplicate the weekly template for the number of weeks in your


individual preparation journey.
Weekly study template (example)

Area(s) of focus: Managing service accounts

Courses/modules to complete:
● Managing Security in Google Cloud: M3 Identity and Access Management
● Security Best Practices in Google Cloud: M1 Securing Compute Engine, M4 Securing Kubernetes

Skill Badges/labs to complete:
● Ensure Access and Identity in Google Cloud Quest

Documentation to review:
● Service accounts | IAM Documentation | Google Cloud
● Creating short-lived service account credentials | IAM Documentation | Google Cloud
● Restricting service account usage | Resource Manager Documentation | Google Cloud

Additional study: Sample questions 1-3
Weekly study template

Area(s) of focus:

Courses/modules to complete:

Skill Badges/labs to complete:

Documentation to review:

Additional study:
