PCSE Workbook
Professional Cloud
Security Engineer
Journey
Course Workbook
Certification Exam Guide Sections
1 Configuring access within a cloud solution environment
5 Ensuring compliance
Section 1:
Configuring access within a
cloud solution environment
1.1 Diagnostic Question 01

Cymbal Bank has acquired a non-banking financial company (NBFC). This NBFC uses Active Directory as their central directory on an on-premises Windows Server. You have been tasked with migrating all the NBFC users and employee information to Cloud Identity.

What should you do?

A. Run Microsoft System Center Configuration Manager (SCCM) on a Compute Engine instance. Leave the channel unencrypted because you are in a secure Google Cloud environment. Deploy Google Cloud Directory Sync on the Compute Engine instance. Connect to the on-premises Windows Server environment from the instance, and migrate users to Cloud Identity.
B. Run Configuration Manager on a Compute Engine instance. Copy the resulting configuration file from this machine onto a new Compute Engine instance to keep the production environment separate from the staging environment. Leave the channel unencrypted because you are in a secure Google Cloud environment. Deploy Google Cloud Directory Sync on this new instance. Connect to the on-premises Windows Server environment from the new instance, and migrate users to Cloud Identity.
C. Use Cloud VPN to connect the on-premises network to your Google Cloud environment. Select an on-premises domain-joined Windows Server. On the domain-joined Windows Server, run Configuration Manager and Google Cloud Directory Sync. Use Cloud VPN’s encrypted channel to transfer users from the on-premises Active Directory to Cloud Identity.
D. Select an on-premises domain-joined Windows Server. Run Configuration Manager on the domain-joined Windows Server, and copy the resulting configuration file to a Compute Engine instance. Run Google Cloud Directory Sync on the Compute Engine instance over the internet, and use Cloud VPN to sync users from the on-premises Active Directory to Cloud Identity.
1.1 Diagnostic Question 02

Cymbal Bank has certain default permissions and access for their analyst, finance, and teller teams. These teams are organized into groups that have a set of role-based IAM permissions assigned to them. After a recent acquisition of a small bank, you find that the small bank directly assigns permissions to their employees in IAM. You have been tasked with applying Cymbal Bank’s organizational structure to the small bank. Employees will need access to Google Cloud services.

What should you do?

A. Leave all user permissions as-is in the small bank’s IAM. Use the Directory API in the Google Workspace Admin SDK to create Google Groups. Use a Python script to allocate users to the Google Groups.
B. Reset all user permissions in the small bank’s IAM. Use Cloud Identity to create dynamic groups for each of the bank’s teams. Use the dynamic groups’ metadata field for team type to allocate users to their appropriate group with a Python script.
C. Reset all user permissions in the small bank’s IAM. Use Cloud Identity to create the required Google Groups. Upgrade the Google Groups to Security Groups. Use a Python script to allocate users to the groups.
D. Reset all user permissions in the small bank’s IAM. Use the Directory API in the Google Workspace Admin SDK to create Google Groups. Use a Python script to allocate users to the groups.
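Each option ends with scripting users into groups. As a minimal sketch of that step, the same bulk allocation can be done from a shell loop around the `gcloud identity groups memberships add` command; the group and member email addresses below are hypothetical placeholders, and the real call (which needs Cloud Identity and authentication) is left commented out:

```shell
# Hypothetical group and members; not from the workbook.
GROUP="tellers@cymbal.example"
printf 'ana@cymbal.example\nbob@cymbal.example\n' > members.txt

# Dry-run loop: print the planned memberships instead of calling the API.
while read -r MEMBER; do
  # Real call (requires auth and a Cloud Identity account):
  # gcloud identity groups memberships add \
  #     --group-email="$GROUP" --member-email="$MEMBER"
  echo "would add $MEMBER to $GROUP"
done < members.txt > planned.txt
cat planned.txt
```

The same loop body works whether the groups were created in the Admin console, with the Directory API, or with `gcloud identity groups create`.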
Proprietary + Confidential

Courses
● Security in Google Cloud: M2 Cloud Identity
● Managing Security in Google Cloud: M2 Cloud Identity

Documentation
● Active Directory user account provisioning | Identity and access management | Google Cloud
● What is Configuration Manager? - Google Workspace Admin Help
● Manage membership automatically with dynamic groups - Google Workspace Admin Help
● Creating and updating a dynamic group | Cloud Identity
● Create and manage groups using APIs - Google Workspace Admin Help
1.2 Diagnostic Question 03

Cymbal Bank leverages Google Cloud storage services, an on-premises Apache Spark Cluster, and a web application hosted on a third-party cloud. The Spark cluster and web application require limited access to Cloud Storage buckets and a Cloud SQL instance for only a few hours per day. You have been tasked with sharing credentials while minimizing the risk that the credentials will be compromised.

What should you do?

A. Create a service account with appropriate permissions. Authenticate the Spark Cluster and the web application as direct requests and share the service account key.
B. Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as delegated requests, and share the short-lived service account credential as a JWT.
C. Create a service account with appropriate permissions. Authenticate the Spark Cluster and the web application as a delegated request, and share the service account key.
D. Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as a direct request, and share the short-lived service account credentials as XML tokens.
1.2 Diagnostic Question 04

Cymbal Bank’s Mobile Development Team has an AI Platform instance in a Google Cloud Project. An auditor needs to record the AI Platform jobs and models, along with their usage. You need to assign permissions to the external auditors so that they can view the models and jobs but not retrieve specific details on any of them.

What should you do?

A. Create a custom role for auditors at the Organization level. Create a JSON file with required permissions ml.models.list and ml.jobs.list. Use gcloud iam roles create role-id --organization organization-id --file=json-file-path.
B. Create a custom role for auditors at the Project level. Create a YAML file with required permissions ml.models.list and ml.jobs.list. Use gcloud iam roles create role-id --project project-id --file=yaml-file-path.
C. Create a custom role for auditors at the Project level. Use gcloud iam roles create role-name --project project-id --permissions=ml.models.get,ml.jobs.get.
D. Create a custom role for auditors at the Organization level. Create a JSON file with required permissions ml.models.list and ml.jobs.list. Use gcloud iam role create role-id --organization organization-id --file=json-file-path.
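As a hedged sketch of the YAML-file approach in option B: the role id, title, and project below are placeholders, and the actual `gcloud iam roles create` call is commented out. The `list` permissions let auditors enumerate models and jobs without reading the details of any one of them.

```shell
# Hypothetical custom-role definition file for the auditors.
cat > auditor-role.yaml <<'EOF'
title: AI Platform Auditor
description: List AI Platform models and jobs without retrieving details
stage: GA
includedPermissions:
- ml.models.list
- ml.jobs.list
EOF

# Real call (placeholder role id and project):
# gcloud iam roles create aiAuditor --project=my-project --file=auditor-role.yaml
```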
Courses
● Security in Google Cloud: M2 Cloud Identity; M3 Identity and Access Management (IAM)
● Managing Security in Google Cloud: M2 Cloud Identity; M3 Identity and Access Management (IAM)

Documentation
● Understanding hierarchy evaluation | Resource Manager Documentation | Google Cloud
● Creating and managing organizations | Resource Manager Documentation | Google Cloud
● Best practices for enterprise organizations | Documentation | Google Cloud
Section 2:
Configuring network security
What should you do?

D. Import a self-managed SSL certificate. Attach a global static external IP address to the SSL Proxy load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an SSL proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy.
2.1 Diagnostic Question 03

Your organization has a website running on Compute Engine. This instance only has a private IP address. You need to provide SSH access to an on-premises developer who will debug the website from the authorized on-premises location only.

How do you enable this?

A. Set up Cloud VPN. Set up an unencrypted tunnel to one of the hosts in the network. Create outbound or egress firewall rules. Use the private IP address to log in using a gcloud ssh command.
B. Use SOCKS proxy over SSH. Set up an SSH tunnel to one of the hosts in the network. Create the SOCKS proxy on the client side.
C. Use the default VPC’s firewall. Open port 22 for TCP protocol using the Google Cloud Console.
D. Use Identity-Aware Proxy (IAP). Set up IAP TCP forwarding by creating ingress firewall rules on port 22 for TCP using the gcloud command.
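A minimal sketch of the IAP approach in option D, with the network, instance, and zone names as placeholders: 35.235.240.0/20 is the source range Google publishes for IAP TCP forwarding, and the actual gcloud calls are commented out.

```shell
# Google's published source range for IAP TCP forwarding.
IAP_RANGE="35.235.240.0/20"

# Ingress rule allowing SSH only from the IAP range (placeholder names):
# gcloud compute firewall-rules create allow-iap-ssh \
#     --network=default --direction=INGRESS --action=ALLOW \
#     --rules=tcp:22 --source-ranges="$IAP_RANGE"

# SSH to the private-IP-only VM through the IAP tunnel:
# gcloud compute ssh website-vm --zone=us-central1-a --tunnel-through-iap

echo "IAP ingress allowed from $IAP_RANGE on tcp:22"
```

Because the firewall rule admits only the IAP range, the VM never needs a public IP address, and access can additionally be restricted per-user with IAM.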
A. Create user accounts for the application and database. Create a firewall rule using:
gcloud compute firewall-rules create ALLOW_MONGO_DB
--network network-name
--deny UDP:27017
--source-service-accounts web-application-user-account
--target-service-accounts database-admin-user-account
2.2 Diagnostic Question 06

Cymbal Bank has designed an application to detect credit card fraud that will analyze sensitive information. The application that’s running on a Compute Engine instance is hosted in a new subnet on an existing VPC. Multiple teams who have access to other VMs in the same VPC must access the VM. You want to configure the access so that unauthorized VMs or users from the internet can’t access the fraud detection VM.

What should you do?

A. Use subnet isolation. Create a service account for the fraud detection VM. Create one service account for all the teams’ Compute Engine instances that will access the fraud detection VM. Create a new firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
--source-service-accounts <one service account for all teams>
--target-service-accounts <fraud detection engine’s service account>
B. Use target filtering. Create two tags called ‘app’ and ‘data’. Assign the ‘app’ tag to the Compute Engine instance hosting the Fraud Detection App (source), and assign the ‘data’ tag to the other Compute Engine instances (target). Create a firewall rule to allow all ingress communication on this tag.
C. Use subnet isolation. Create a service account for the fraud detection engine. Create service accounts for each of the teams’ Compute Engine instances that will access the engine. Add a firewall rule using:
gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE
--network <network name>
--allow TCP:80
--source-service-accounts <list of service accounts>
--target-service-accounts <fraud detection engine’s service account>
D. Use target filtering. Create a tag called ‘app’, and assign the tag to both the source and the target. Create a firewall rule to allow all ingress communication on this tag.
2.2 Configuring network segmentation

What should you do?

A. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a client-side connector in the Service Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.
B. Add ingress firewall rules to allow NAT and Health Check ranges for the App Engine standard environment in the Shared VPC network. Create a server-side connector in the Host Project using the Shared VPC Project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using Network Tags or IP ranges.
2.3 Diagnostic Question 08

Cymbal Bank’s Customer Details API runs on a Compute Engine instance with only an internal IP address. Cymbal Bank’s new branch is co-located outside the Google Cloud points-of-presence (PoPs) and requires a low-latency way for its on-premises apps to consume the API without exposing the requests to the public internet.

Which solution would you recommend?

A. Use a Content Delivery Network (CDN). Establish direct peering with one of Google’s nearby edge-enabled PoPs.
B. Use Carrier Peering. Use a service provider to access their enterprise grade infrastructure to connect to the Google Cloud environment.
C. Use Partner Interconnect. Use a service provider to access their enterprise grade infrastructure to connect to the Google Cloud environment.
D. Use Dedicated Interconnect. Establish direct peering with one of Google’s nearby edge-enabled PoPs.
2.3 Diagnostic Question 09

An external audit agency needs to perform a one-time review of Cymbal Bank’s Google Cloud usage. The auditors should be able to access a Default VPC containing BigQuery, Cloud Storage, and Compute Engine instances where all the usage information is stored. You have been tasked with enabling the access from their on-premises environment, which already has a configured VPN.

What should you do?

A. Use a Cloud VPN tunnel. Use your DNS provider to create DNS zones and records for private.googleapis.com. Connect the DNS provider to your on-premises network. Broadcast the request from the on-premises environment. Use a software-defined firewall to manage incoming and outgoing requests.
B. Use Partner Interconnect. Configure an encrypted tunnel in the auditor's on-premises environment. Use Cloud DNS to create DNS zones and A records for private.googleapis.com.
C. Use a Cloud VPN tunnel. Use Cloud DNS to create DNS zones and records for *.googleapis.com. Set up on-premises routing with Cloud Router. Use Cloud Router custom route advertisements to announce routes for Google Cloud destinations.
D. Use Direct Interconnect. Configure a VLAN in the auditor's on-premises environment. Use Cloud DNS to create DNS zones and records for restricted.googleapis.com and private.googleapis.com. Set up on-premises routing with Cloud Router. Add custom static routes in the VPC to connect individually to BigQuery, Cloud Storage, and Compute Engine instances.
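The private.googleapis.com records that several options mention can be sketched with Cloud DNS as follows; the zone name and network are placeholders, and the real gcloud calls are commented out. private.googleapis.com resolves to the 199.36.153.8/30 range, so on-premises hosts reach Google APIs over the VPN rather than the public internet.

```shell
# The four addresses in 199.36.153.8/30 used by private.googleapis.com.
PGA_IPS="199.36.153.8,199.36.153.9,199.36.153.10,199.36.153.11"

# Private zone visible to the VPC (placeholder zone and network names):
# gcloud dns managed-zones create googleapis-zone --dns-name=googleapis.com. \
#     --visibility=private --networks=default --description="Private Google Access"

# A records for private.googleapis.com, plus a wildcard CNAME so that
# *.googleapis.com requests resolve to it:
# gcloud dns record-sets create private.googleapis.com. --zone=googleapis-zone \
#     --type=A --ttl=300 --rrdatas="$PGA_IPS"
# gcloud dns record-sets create '*.googleapis.com.' --zone=googleapis-zone \
#     --type=CNAME --ttl=300 --rrdatas=private.googleapis.com.

echo "$PGA_IPS"
```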
2.3 Diagnostic Question 10

An external audit agency needs to perform a one-time review of Cymbal Bank’s Google Cloud usage. The auditors should be able to access a Default VPC containing BigQuery, Cloud Storage, and Compute Engine instances where all the usage information is stored. You have been tasked with enabling the access from their on-premises environment, which already has a configured VPN.

A. Cloud DNS, subnet primary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster
B. Cloud VPN, subnet secondary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster
C. Nginx load balancer, subnet secondary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster
D. Cloud NAT gateway, subnet primary IP address range for nodes, and subnet secondary IP address range for pods and services in the cluster

How would you configure access in the vendor Projects so that vendors can’t communicate with each other, but can still copy the data from the bank’s Cloud Storage bucket?

D. Use VPC Service Controls with Context-aware access with ingress rules. Use the command gcloud access-context-manager perimeters update and set ingress rules for the bank’s bucket in the vendor’s Cloud Storage buckets separately. Use IAM to provide appropriate permissions.
3.1 Diagnostic Question 05

Cymbal Bank has a Cloud SQL instance that must be shared with an external agency. The agency’s developers will be assigned roles and permissions through a Google Group in Identity and Access Management (IAM). The external agency is on an annual contract and will require a connection string, username, and password to connect to the database.

How would you configure the group’s access?

A. Use Secret Manager. Use the duration attribute to set the expiry period to one year. Add the secretmanager.secretAccessor role for the group that contains external developers.
B. Use Cloud Key Management Service. Use the destination IP address and Port attributes to provide access for developers at the external agency. Remove the IAM access after one year and rotate the shared keys. Add cloudkms.cryptoKeyEncryptorDecryptor role for the group that contains the external developers.
C. Use Secret Manager. Use the resource attribute to set a key-value pair with key as duration and values as expiry period one year from now. Add secretmanager.viewer role for the group that contains external developers.
D. Use Secret Manager for the connection string and username, and use Cloud Key Management Service for the password. Use tags to set the expiry period to the timestamp one year from now. Add secretmanager.secretVersionManager and secretmanager.secretAccessor roles for the group that contains external developers.
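A hedged gcloud sketch of the Secret Manager approach in option A, with the secret name, value, and group email as placeholders and the real calls commented out: the secret is created with a one-year TTL and the external group gets only the accessor role.

```shell
# One year expressed in seconds, for the secret's --ttl flag.
TTL_SECONDS=$((365 * 24 * 60 * 60))

# Create the secret with a one-year TTL (placeholder names):
# gcloud secrets create db-connection --replication-policy=automatic \
#     --ttl="${TTL_SECONDS}s"

# Store the connection details as a secret version (placeholder value):
# printf 'mysql://user:pass@10.0.0.5:3306/loans' | \
#     gcloud secrets versions add db-connection --data-file=-

# Grant the external developers' group read access to the payload only:
# gcloud secrets add-iam-policy-binding db-connection \
#     --member=group:external-devs@agency.example \
#     --role=roles/secretmanager.secretAccessor

echo "$TTL_SECONDS"
```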
3.1 Diagnostic Question 06

Cymbal Bank wants to deploy an n-tier web application. The frontend must be supported by an App Engine deployment, an API with a Compute Engine instance, and Cloud SQL for a MySQL database. This application is only supported during working hours; outside of those hours, App Engine is disabled and Compute Engine is stopped.

How would you enable the infrastructure to access the database?

A. Use VM metadata to read the current machine’s IP address, and use a gcloud command to add access to Cloud SQL. Store Cloud SQL’s connection string and password in Cloud Key Management Service. Store the Username in Project metadata.
B. Use Project metadata to read the current machine’s IP address, and use a startup script to add access to Cloud SQL. Store Cloud SQL’s connection string in Cloud Key Management Service, and store the password in Secret Manager. Store the Username in Project metadata.
C. Use Project metadata to read the current machine’s IP address and use a gcloud command to add access to Cloud SQL. Store Cloud SQL’s connection string and username in Cloud Key Management Service, and store the password in Secret Manager.
D. Use VM metadata to read the current machine’s IP address and use a startup script to add access to Cloud SQL. Store Cloud SQL’s connection string, username, and password in Secret Manager.
3.1 Protecting sensitive data

Courses
● Security in Google Cloud: M4 Configuring Virtual Private Cloud for Isolation and Security; M5 Securing Compute Engine: Techniques and Best Practices; M6 Securing Cloud Data: Techniques and Best Practices; M7 Application Security: Techniques and Best Practices; M10 Content-Related Vulnerabilities: Techniques and Best Practices
● Managing Security in Google Cloud: M4 Configuring Virtual Private Cloud for Isolation and Security
● Security Best Practices in Google Cloud: M1 Securing Compute Engine: Techniques and Best Practices; M2 Securing Cloud Data: Techniques and Best Practices; M3 Application Security: Techniques and Best Practices
● Mitigating Security Vulnerabilities in Google Cloud: M3 Monitoring, Logging, Auditing, and Scanning

Documentation
● Image inspection and redaction | Data Loss Prevention Documentation | Google Cloud
● Redacting sensitive data from images | Data Loss Prevention Documentation | Google Cloud
● InfoType detector reference | Data Loss Prevention Documentation | Google Cloud
● Pseudonymization | Data Loss Prevention Documentation | Google Cloud
● Authorized views | BigQuery | Google Cloud
● Authorized datasets | BigQuery | Google Cloud
● Sharing across perimeters with bridges | VPC Service Controls | Google Cloud
● Creating a perimeter bridge | VPC Service Controls | Google Cloud
● Context-aware access with ingress rules | VPC Service Controls | Google Cloud
● Frequently asked questions | Cloud IAM Documentation
● Access control with IAM | Secret Manager Documentation | Google Cloud
● About VM metadata | Compute Engine Documentation | Google Cloud
3.2 Diagnostic Question 07

Cymbal Bank calculates employee incentives on a monthly basis for the sales department and on a quarterly basis for the marketing department. The incentives are released with the next month’s salary. Employees’ sales performance documents are stored as spreadsheets, which are retained for at least one year for audit. You want to configure the most cost-effective storage for this scenario.

What should you do?

A. Import the spreadsheets to BigQuery, and create separate tables for Sales and Marketing. Set table expiry rules to 365 days for both tables. Create jobs scheduled to run every quarter for Marketing and every month for Sales.
B. Upload the spreadsheets to Cloud Storage. Select the Nearline storage class for the sales department and Coldline storage for the marketing department. Use object lifecycle management rules to set the storage class to Archival after 365 days. Process the data on BigQuery using jobs that run monthly for Sales and quarterly for Marketing.
C. Import the spreadsheets to Cloud SQL, and create separate tables for Sales and Marketing. For Table Expiration, set 365 days for both tables. Use stored procedures to calculate incentives. Use App Engine cron jobs to run stored procedures monthly for Sales and quarterly for Marketing.
D. Import the spreadsheets into Cloud Storage and create NoSQL tables. Use App Engine cron jobs to run monthly for Sales and quarterly for Marketing. Use a separate job to delete the data after 1 year.
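The lifecycle rule in option B can be sketched as a JSON policy applied with gsutil; the bucket name is a placeholder, and the storage class uses its API name, ARCHIVE.

```shell
# Move objects to the Archive storage class once they are a year old.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# Apply to the bucket (placeholder name):
# gsutil lifecycle set lifecycle.json gs://cymbal-performance-docs
```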
3.2 Diagnostic Question 08

Cymbal Bank uses Google Kubernetes Engine (GKE) to deploy its Docker containers. You want to encrypt the boot disk for a cluster running a custom image so that the key rotation is controlled by the Bank. GKE clusters will also generate up to 1024 randomized characters that will be used with the keys with Docker containers.

What steps would you take to apply the encryption settings with a dedicated hardware security layer?

A. In the Google Cloud console, navigate to Google Kubernetes Engine. Select your cluster and the boot node inside the cluster. Enable customer-managed encryption. Use Cloud HSM to generate random bytes and provide an additional layer of security.
B. Create a new GKE cluster with customer-managed encryption and HSM enabled. Deploy the containers to this cluster. Delete the old GKE cluster. Use Cloud HSM to generate random bytes and provide an additional layer of security.
C. Create a new key ring using Cloud Key Management Service. Extract this key to a certificate. Use the kubectl command to update the Kubernetes configuration. Validate using MAC digital signatures, and use a startup script to generate random bytes.
D. Create a new key ring using Cloud Key Management Service. Extract this key to a certificate. Use the Google Cloud Console to update the Kubernetes configuration. Validate using MAC digital signatures, and use a startup script to generate random bytes.
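A hedged sketch of option B with gcloud; the project, location, key ring, key, and cluster names are all placeholders, and every cloud call is commented out. Boot-disk encryption cannot be switched on for an existing cluster, which is why a new cluster is created with the HSM-backed key, and the Cloud HSM random-bytes API caps a single request at 1024 bytes, matching the scenario.

```shell
# HSM-protected key for the node boot disks (placeholder names):
# gcloud kms keyrings create gke-kr --location=us-central1
# gcloud kms keys create boot-disk-key --keyring=gke-kr --location=us-central1 \
#     --purpose=encryption --protection-level=hsm

# New cluster whose node boot disks use the customer-managed key:
# gcloud container clusters create fraud-cluster --zone=us-central1-a \
#     --boot-disk-kms-key=projects/my-proj/locations/us-central1/keyRings/gke-kr/cryptoKeys/boot-disk-key

# Random bytes from Cloud HSM; one request returns at most 1024 bytes.
BYTE_COUNT=1024
# gcloud kms random-bytes --location=us-central1 \
#     --byte-count="$BYTE_COUNT" --protection-level=hsm > random.bin

echo "$BYTE_COUNT"
```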
3.2 Diagnostic Question 09

Cymbal Bank has an equated monthly installment (EMI) application. This application must comply with PCI-DSS standards because it stores credit card information. For additional security, you use asymmetric keys to encrypt the data and rotate the keys at fixed intervals. Cymbal Bank has recently migrated to Google Cloud, and you need to set up key rotation.

What should you do?

A. Use manual key rotation and assign yourself the cloudkms.cryptoKeyEncrypterDecrypter role.
B. Use automatic key rotation and assign yourself the cloudkms.cryptoKeyEncrypterDecrypter role.
C. Use automatic key rotation and assign yourself the cloudkms.admin role.
D. Use manual key rotation and assign yourself the cloudkms.admin role.
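For context on the manual-rotation options: Cloud KMS does not automatically rotate asymmetric keys, so rotating one means creating a new key version yourself on whatever schedule you choose. A minimal sketch, with placeholder key, key-ring, and location names and the real call commented out:

```shell
# Manual rotation of an asymmetric key = create a new version on your own
# schedule (placeholder names; requires auth):
# gcloud kms keys versions create --key=emi-card-key --keyring=pci-kr \
#     --location=us-central1

# Automatic rotation (--rotation-period / --next-rotation-time flags) applies
# only to symmetric encryption keys.
ROTATION="manual"
echo "$ROTATION"
```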
Cymbal Bank has received Docker source files from its third-party developers in an Artifact Registry repository. These Docker files will be part of a CI/CD pipeline to update Cymbal Bank’s personal loan offering. The bank wants to prevent the possibility of remote users arbitrarily using the Docker files to run any code. You have been tasked with using Container Analysis’ On-Demand scanning to scan the images for a one-time update.

What should you do?

A. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, scan, severity check, and push—specifying the location of the Artifact Registry repository. Specify severity level as CRITICAL. Start the build with the command gcloud builds submit.
B. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, build, severity check, and push—specifying the location of the Artifact Registry repository. Specify severity level as HIGH. Start the build with the command gcloud builds submit.
C. Prepare a cloudbuild.yaml file. In this file, add four steps in order—scan, severity check, build, and push—specifying the location of the Artifact Registry repository. Specify severity level as HIGH. Start the build with the command gcloud builds submit.
D. Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, severity check, scan, and push—specifying the location of the Artifact Registry repository. Specify severity level as CRITICAL. Start the build with the command gcloud builds submit.
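The build-scan-check-push ordering can be sketched as a cloudbuild.yaml written from the shell; the image path, project, and repository names are placeholders modeled on Google's on-demand scanning walkthrough, and the final `gcloud builds submit` is commented out. The severity-check step fails the build (and so blocks the push) if the scan reports any CRITICAL vulnerability.

```shell
# Four ordered steps: build the image, scan it on demand, fail on CRITICAL
# findings, then push only if the check passed (placeholder image path).
cat > cloudbuild.yaml <<'EOF'
steps:
- id: build
  name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'us-central1-docker.pkg.dev/my-proj/repo/loan-app', '.']
- id: scan
  name: gcr.io/google.com/cloudsdktool/cloud-sdk
  entrypoint: bash
  args: ['-c', 'gcloud artifacts docker images scan us-central1-docker.pkg.dev/my-proj/repo/loan-app --format="value(response.scan)" > /workspace/scan_id.txt']
- id: severity-check
  name: gcr.io/google.com/cloudsdktool/cloud-sdk
  entrypoint: bash
  args: ['-c', '! gcloud artifacts docker images list-vulnerabilities "$(cat /workspace/scan_id.txt)" --format="value(vulnerability.effectiveSeverity)" | grep -qx CRITICAL']
- id: push
  name: gcr.io/cloud-builders/docker
  args: ['push', 'us-central1-docker.pkg.dev/my-proj/repo/loan-app']
EOF

# gcloud builds submit --config=cloudbuild.yaml .
```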
4.1 Diagnostic Question 02

Cymbal Bank’s management is concerned about virtual machines being compromised by bad actors. More specifically, they want to receive immediate alerts if there have been changes to the boot sequence of any of their Compute Engine instances.

What should you do?

A. Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Secure Boot enabled with Unified Extensible Firmware Interface (UEFI). Validate integrity events in Cloud Monitoring and place alerts on launch attestation events.
B. Set Cloud Logging measurement policies on the VMs. Use Cloud Logging to place alerts whenever actualMeasurements and policyMeasurements don’t match.
C. Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events.
D. Set project-level policies that require all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events.
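The Shielded VM pieces these options refer to can be sketched with gcloud; the organization ID, instance, and zone are placeholders, and the real calls are commented out. The boolean constraint forces every new VM to be Shielded, and the per-instance flags enable vTPM-backed Measured Boot with integrity monitoring.

```shell
# Boolean org-policy constraint requiring Shielded VMs.
CONSTRAINT="compute.requireShieldedVm"

# Enforce at the organization level (placeholder org ID):
# gcloud resource-manager org-policies enable-enforce "$CONSTRAINT" \
#     --organization=123456789

# A VM with Measured Boot support (vTPM + integrity monitoring + Secure Boot):
# gcloud compute instances create monitored-vm --zone=us-central1-a \
#     --shielded-vtpm --shielded-integrity-monitoring --shielded-secure-boot

echo "$CONSTRAINT"
```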
4.1 Diagnostic Question 03

Cymbal Bank runs a Node.js application on a Compute Engine instance. Cymbal Bank needs to share this base image with a ‘development’ Google Group. This base image should support secure boot for the Compute Engine instances deployed from this image.

How would you automate the image creation?

A. Prepare a shell script. Add the command gcloud compute instances stop with the Node.js instance name. Set up certificates for secure boot. Add gcloud compute images create, and specify the Compute Engine instance’s persistent disk and zone and the certificate files. Add gcloud compute images add-iam-policy-binding and specify the ‘development’ group.
B. Start the Compute Engine instance. Set up certificates for secure boot. Prepare a cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine and the ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration file path and the certificates.
C. Prepare a shell script. Add the command gcloud compute instances start to the script to start the Node.js Compute Engine instance. Set up Measured Boot for secure boot. Add gcloud compute images create, and specify the persistent disk and zone of the Compute Engine instance.
D. Stop the Compute Engine instance. Set up Measured Boot for secure boot. Prepare a cloudbuild.yaml configuration file. Specify the persistent disk location of the Compute Engine instance and the ‘development’ group. Use the command gcloud builds submit --tag, and specify the configuration file path.
4.1 Diagnostic Question 04

Cymbal Bank uses Docker containers to interact with APIs for its personal banking application. These APIs are under PCI-DSS compliance. The Kubernetes environment running the containers will not have internet access to download required packages.

How would you automate the pipeline that is building these containers?

A. Create a Dockerfile with container definition and cloudbuild.yaml file. Use Cloud Build to build the image from Dockerfile. Upload the built image to a Google Container registry and Dockerfile to a Git repository. In the cloudbuild.yaml template, include attributes to tag the Git repository path with a Google Kubernetes Engine cluster. Create a trigger in Cloud Build to automate the deployment using the Git repository.
B. Create a Dockerfile with a container definition and a Cloud Build configuration file. Use the Cloud Build configuration file to build and deploy the image from Dockerfile to a Google Container registry. In the configuration file, include the Google Container Registry path and the Google Kubernetes Engine cluster. Upload the configuration file to a Git repository. Create a trigger in Cloud Build to automate the deployment using the Git repository.
C. Build a foundation image. Store all artifacts and a Packer definition template in a Git repository. Use Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups to the GKE project.
D. Build an immutable image. Store all artifacts and a Packer definition template in a Git repository. Use Container Registry to build the artifacts and Packer definition. Use Cloud Build to extract the built container and deploy it to a Google Kubernetes Engine (GKE) cluster. Add the required users and groups to the GKE project.
Cymbal Bank wants to use Cloud Storage and BigQuery to store safe deposit usage data. Cymbal Bank needs a cost-effective approach to auditing only Cloud Storage and BigQuery data access activities.

How would you use Cloud Audit Logs to enable this analysis?

A. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the service level for BigQuery and Cloud Storage.
B. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE at the organization level.
C. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for Cloud Storage. All Data Access Logs are enabled for BigQuery by default.
D. Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for BigQuery. All Data Access Logs are enabled for Cloud Storage by default.
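The Data Access log configuration these options describe lives in the IAM policy as an auditConfigs stanza. A minimal sketch for Cloud Storage (BigQuery's Data Access logs are on by default, so it needs no stanza); the project ID is a placeholder and the policy round-trip is commented out:

```shell
# auditConfigs stanza enabling all three Data Access log types for
# Cloud Storage only.
cat > audit-config.yaml <<'EOF'
auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_READ
  - logType: DATA_WRITE
EOF

# Merge into the project IAM policy (placeholder project ID):
# gcloud projects get-iam-policy my-proj > policy.yaml
# (append the auditConfigs stanza to policy.yaml, then)
# gcloud projects set-iam-policy my-proj policy.yaml
```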
4.2 Diagnostic Question 08

Cymbal Bank has suffered a remote botnet attack on Compute Engine instances in an isolated project. The affected project now requires investigation by an external agency. An external agency requests that you provide all admin and system events to analyze in their local forensics tool. You want to use the most cost-effective solution to enable the external analysis.

What should you do?

A. Use Event Threat Detection. Trigger the IAM Anomalous Grant detector to detect all admins and users with admin or system permissions. Export these logs to the Security Command Center. Give the external agency access to the Security Command Center.
B. Use Cloud Audit Logs. Filter Admin Activity audit logs for only the affected project. Use a Pub/Sub topic to stream the logs from Cloud Audit Logs to the external agency’s forensics tool.
C. Use the Security Command Center. Select Cloud Logging as the source, and filter by category: Admin Activity and category: System Activity. View the Source property of the Finding Details section. Use Pub/Sub topics to export the findings to the external agency’s forensics tool.
D. Use Cloud Monitoring and Cloud Logging. Filter Cloud Monitoring to view only system and admin logs. Expand the system and admin logs in Cloud Logging. Use Pub/Sub to export the findings from Cloud Logging to the external agency’s forensics tool or storage.
4.2 Diagnostic Question 09

The loan application from Cymbal Bank’s lending department collects credit reports that contain credit payment information from customers. According to bank policy, the PDF reports are stored for six months in Cloud Storage, and access logs for the reports are stored for three years. You need to configure a cost-effective storage solution for the access logs.

A. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and Cloud Monitoring. Create table expiry rules to delete logs after three years.

B. Set up a logging export dataset in BigQuery to collect data from Cloud Logging and the Security Command Center. Create table expiry rules to delete logs after three years.

C. Set up a logging export bucket in Cloud Storage to collect data from the Security Command Center. Configure object lifecycle management rules to delete logs after three years.

D. Set up a logging export bucket in Cloud Storage to collect data from Cloud Audit Logs. Configure object lifecycle management rules to delete logs after three years.
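Options C and D both depend on Cloud Storage object lifecycle management to expire the logs. A minimal sketch of that configuration, expressed as the JSON-style structure Cloud Storage lifecycle management accepts (1,095 days is used here to approximate three years):

```python
import json

# Lifecycle configuration that deletes log objects roughly three
# years (~1095 days) after creation. This mirrors the JSON shape
# that Cloud Storage object lifecycle management accepts.
lifecycle_config = {
    "rule": [
        {
            "action": {"type": "Delete"},
            "condition": {"age": 1095},  # days since object creation
        }
    ]
}

print(json.dumps(lifecycle_config, indent=2))
```

Applied to the export bucket, this rule removes each log object once it passes the retention window, with no ongoing query or compute cost.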
Proprietary + Confidential
4.2 Configuring logging, monitoring, and detection

Courses
● Security in Google Cloud
  ○ M11 Monitoring, Logging, Auditing, and Scanning
● Mitigating Security Vulnerabilities in Google Cloud
  ○ M3 Monitoring, Logging, Auditing, and Scanning

Documentation
● Security controls and forensic analysis for GKE apps | Cloud Architecture Center
● Scenarios for exporting logging data: Security and access analytics | Cloud Architecture Center | Google Cloud
● Cloud Audit Logs overview
● Cloud Audit Logs with Cloud Storage | Google Cloud
● Configure Data Access audit logs
● Scenarios for exporting Cloud Logging: Compliance requirements | Cloud Architecture Center | Google Cloud
● Security sources for vulnerabilities and threats | Security Command Center | Google Cloud
● Configuring Security Command Center
● Enabling real-time email and chat notifications
Section 5:
Ensuring compliance
5.1 Diagnostic Question 01
Cymbal Bank’s lending department stores sensitive information, such as your customers’ credit history, address, and phone number, in parquet files. You need to upload this personally identifiable information (PII) to Cloud Storage so that it’s secure and compliant with ISO 27018.

How should you protect this sensitive information using Cymbal Bank’s encryption keys and using the least amount of computational resources?

A. Generate an AES-256 key as a 32-byte bytestring. Decode it as a base-64 string. Upload the blob to the bucket using this key.

B. Generate an RSA key as a 32-byte bytestring. Decode it as a base-64 string. Upload the blob to the bucket using this key.

C. Generate a customer-managed encryption key (CMEK) using RSA or AES256 encryption. Decode it as a base-64 string. Upload the blob to the bucket using this key.

D. Generate a customer-managed encryption key (CMEK) using Cloud KMS. Decode it as a base-64 string. Upload the blob to the bucket using this key.
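Option A’s key preparation can be sketched in a few lines: generate a random 32-byte bytestring (a valid AES-256 key) and base64-encode it, which is the form Cloud Storage expects for a customer-supplied encryption key. The upload itself is omitted; this shows only the key handling:

```python
import base64
import os

# Generate a random 32-byte key suitable for AES-256, then
# base64-encode it -- the form Cloud Storage expects for a
# customer-supplied encryption key (CSEK) on upload.
raw_key = os.urandom(32)  # 32-byte bytestring
encoded_key = base64.b64encode(raw_key).decode("ascii")

assert len(raw_key) == 32
print(encoded_key)  # 44-character base-64 string
```

The base64 string (together with a hash of the key) travels with each request, so the key never leaves Cymbal Bank’s control and Google stores only the encrypted object.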
5.1 Diagnostic Question 02
You are designing a web application for Cymbal Bank so that customers who have credit card issues can contact dedicated support agents. Customers may enter their complete credit card number when chatting with or emailing support agents. You want to ensure compliance with PCI-DSS and prevent support agents from viewing this information in the most cost-effective way.

Select the two correct choices.

A. Use customer-supplied encryption keys (CSEK) and Cloud Key Management Service (KMS) to detect and encrypt sensitive information.

B. Detect sensitive information with Cloud Natural Language API.

C. Use customer-managed encryption keys (CMEK) and Cloud Key Management Service (KMS) to detect and encrypt sensitive information.

D. Implement Cloud Data Loss Prevention using its REST API.

E. Deploy a Linux base image from preconfigured operating system images. Install only the libraries you need. Deploy using Cloud Deployment Manager.
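Option D’s Cloud DLP route hinges on an inspection configuration that targets the CREDIT_CARD_NUMBER infoType. A minimal sketch of the request body you might POST to the DLP REST API’s content.deidentify method (the sample chat text and the replacement marker are illustrative choices, not part of the API):

```python
import json

# De-identification request body for the Cloud DLP REST API:
# detect credit card numbers in a chat message and replace them
# with a fixed marker before a support agent ever sees the text.
deidentify_request = {
    "inspectConfig": {
        "infoTypes": [{"name": "CREDIT_CARD_NUMBER"}]
    },
    "deidentifyConfig": {
        "infoTypeTransformations": {
            "transformations": [
                {
                    "primitiveTransformation": {
                        "replaceConfig": {
                            "newValue": {"stringValue": "[REDACTED]"}
                        }
                    }
                }
            ]
        }
    },
    # Illustrative chat message containing a test card number.
    "item": {"value": "My card number is 4111 1111 1111 1111"},
}

print(json.dumps(deidentify_request, indent=2))
```

Because DLP inspects and transforms the content in transit, no agent-facing system needs to store or decrypt the raw card number, which is what keeps this approach cost-effective relative to key-management-based options.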
5.1 Ensuring compliance

Documentation
● Upload an object by using CSEK | Cloud Storage
● Customer-managed encryption keys (CMEK) | Cloud KMS Documentation
● Customer-supplied encryption keys | Cloud Storage
● Data encryption options | Cloud Storage
● ISO/IEC 27018 Certified Compliant | Google Cloud
● Automating the Classification of Data Uploaded to Cloud Storage | Cloud Architecture Center | Google Cloud
● Cloud DLP client libraries | Data Loss Prevention Documentation
● Data Loss Prevention Demo
● Overview of VPC Service Controls | Google Cloud
● Getting to know the Google Cloud Healthcare API: Part 1
● Sharing and collaboration | Cloud Storage
● Google Cloud Platform HIPAA overview guide
● Setting up a HIPAA-aligned project | Cloud Architecture Center
● PCI Data Security Standard compliance | Cloud Architecture Center
When will you take the exam?

Recommended learning path:
● Google Cloud Fundamentals: Core Infrastructure
● Networking in Google Cloud: Defining and implementing networks
● Networking in Google Cloud: Hybrid connectivity and network management
● Build and secure networks in Google Cloud (Skill Badge)
● Managing Security in Google Cloud
● Security Best Practices in Google Cloud
● Mitigating Security Vulnerabilities on Google Cloud
● Ensure Access & Identity in Google Cloud (Skill Badge)
● Secure Workloads in Google Kubernetes Engine (Skill Badge)
● Review documentation
● Sample questions
● Take the certification exam
Weekly study plan
Now, consider what you’ve learned about your knowledge and skills
through the diagnostic questions in this course. You should have a
better understanding of what areas you need to focus on and what
resources are available.
Use the template that follows to plan your study goals for each week.
Consider:
● What exam guide section(s) or topic area(s) will you focus on?
● What courses (or specific modules) will help you learn more?
● What Skill Badges or labs will you work on for hands-on practice?
● What documentation links will you review?
● What additional resources will you use, such as sample questions?
You may do some or all of these study activities each week.
Area(s) of focus:

Courses/modules to complete:

Skill Badges/labs to complete:

Documentation to review:

Additional study: