[GCP] Google Cloud Certified - Professional Security Engineer

Ace Your Professional Cloud Security Engineer with Practice Exams.

Google Cloud Certified – Professional Cloud Security Engineer – Practice Exam (51 Questions)


Question 1

Which two implied firewall rules are defined on a VPC network? (Choose two.)

  • A. A rule that allows all outbound connections.
  • B. A rule that denies all inbound connections.
  • C. A rule that blocks all inbound port 25 connections.
  • D. A rule that blocks all outbound connections.
  • E. A rule that allows all inbound port 80 connections.

Correct Answer: A, B

Reference contents:
VPC firewall rules overview | Google Cloud
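
For reference, the two implied rules (allow all egress, deny all ingress) exist on every VPC network and never appear alongside the explicit rules; inbound traffic is only admitted once an explicit rule overrides the implied deny. A minimal gcloud sketch, where the network name and source range are placeholders:

  # List only explicit rules; the implied allow-egress/deny-ingress rules do not appear here
  gcloud compute firewall-rules list --filter="network=my-vpc"
  # Override the implied deny-ingress for SSH from a known range
  gcloud compute firewall-rules create allow-ssh-from-office \
      --network=my-vpc --allow=tcp:22 --source-ranges=203.0.113.0/24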


Question 2

A customer needs an alternative to storing their plain text secrets in their source-code management (SCM) system.
How should the customer achieve this using Google Cloud Platform?

  • A. Use Google Cloud Source Repositories, and store secrets in Google Cloud SQL.
  • B. Encrypt the secrets with a Customer-Managed Encryption Key (CMEK), and store them in Google Cloud Storage.
  • C. Run the Google Cloud Data Loss Prevention API to scan the secrets, and store them in Google Cloud SQL.
  • D. Deploy the SCM to a Google Compute Engine VM with local SSDs, and enable preemptible VMs.

Correct Answer: B
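
As an illustration of option B, one way to apply a customer-managed key to secrets at rest in Cloud Storage is to set a default Cloud KMS key on the bucket (the project, key ring, key, and bucket names below are hypothetical, and the Cloud Storage service agent must first be granted access to the key):

  # Set a default CMEK on the bucket so new objects are encrypted with it
  gsutil kms encryption \
      -k projects/my-proj/locations/us/keyRings/secrets-kr/cryptoKeys/secrets-key \
      gs://my-secrets-bucket
  # Upload the secret; Cloud Storage encrypts it with the bucket's default CMEK
  gsutil cp app-secrets.env gs://my-secrets-bucket/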


Question 3

When creating a secure container image, which two items should you incorporate into the build if possible? (Choose two.)

  • A. Ensure that the app does not run as PID 1.
  • B. Package a single app as a container.
  • C. Remove any unnecessary tools not needed by the app.
  • D. Use public container images as a base image for the app.
  • E. Use many container image layers to hide sensitive information.

Correct Answer: B, C

Reference contents:
Best practices for building containers | Cloud Architecture Center | Google Cloud


Question 4

A customer needs to launch a 3-tier internal web application on Google Cloud Platform (GCP).
The customer’s internal compliance requirements dictate that end user access may only be allowed if the traffic seems to originate from a specific known good CIDR. The customer accepts the risk that their application will only have SYN flood DDoS protection. They want to use GCP’s native SYN flood protection.
Which product should be used to meet these requirements?

  • A. Google Cloud Armor
  • B. VPC Firewall Rules
  • C. Google Cloud Identity and Access Management
  • D. Google Cloud CDN

Correct Answer: A

Reference contents:
Google Cloud Armor adds WAF, telemetry features | Google Cloud


Question 5

A company is running workloads in a dedicated server room. These workloads must only be accessible from within the private company network. You need to connect to these workloads from Google Compute Engine instances within a Google Cloud Platform project.
Which two approaches can you take to meet the requirements? (Choose two.)

  • A. Configure the project with Cloud VPN.
  • B. Configure the project with Shared VPC.
  • C. Configure the project with Google Cloud Interconnect.
  • D. Configure the project with VPC peering.
  • E. Configure all Google Compute Engine instances with Private Access.

Correct Answer: A, C

Reference contents:
Part 3: Help secure data workloads | Google Cloud


Question 6

A customer implements Google Cloud Identity-Aware Proxy for their ERP system hosted on Google Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Google Cloud Identity-Aware Proxy.
What should the customer do to meet these requirements?

  • A. Make sure that the ERP system can validate the JWT assertion in the HTTP requests.
  • B. Make sure that the ERP system can validate the identity headers in the HTTP requests.
  • C. Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.
  • D. Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.

Correct Answer: A


Question 7

A company has been running their application on Google Compute Engine.
A bug in the application allowed a malicious user to repeatedly execute a script that caused the Google Compute Engine instance to crash. Although the bug has been fixed, you want to be notified if this attack recurs.
What should you do?

  • A. Create an Alerting Policy in Stackdriver using a Process Health condition, checking that the number of executions of the script remains below the desired threshold. Enable notifications.
  • B. Create an Alerting Policy in Stackdriver using the CPU usage metric. Set the threshold to 80% to be notified when the CPU usage goes above this 80%.
  • C. Log every execution of the script to Stackdriver Logging. Create a User-defined metric in Stackdriver Logging on the logs, and create a Stackdriver Dashboard displaying the metric.
  • D. Log every execution of the script to Stackdriver Logging. Configure Google BigQuery as a log sink, and create a Google BigQuery scheduled query to count the number of executions in a specific timeframe.

Correct Answer: C

Reference contents:
Overview of logs-based metrics | Cloud Logging | Google Cloud


Question 8

Your team needs to obtain a unified log view of all development cloud projects in your SIEM.
The development projects are under the NONPROD organization folder with the test and pre-production projects. The development projects share the ABC-BILLING billing account with the rest of the organization.
Which logging export strategy should you use to meet the requirements?

  • A.
    • Export logs to a Google Cloud Pub/Sub topic with folders/NONPROD parent and includeChildren property set to True in a dedicated SIEM project.
    • Subscribe SIEM to the topic.
  • B.
    • Create a Google Cloud Storage sink with billingAccounts/ABC-BILLING parent and includeChildren property set to False in a dedicated SIEM project.
    • Process Google Cloud Storage objects in SIEM.
  • C.
    • Export logs in each dev project to a Google Cloud Pub/Sub topic in a dedicated SIEM project.
    • Subscribe SIEM to the topic.
  • D.
    • Create a Google Cloud Storage sink with a publicly shared Google Cloud Storage bucket in each project.
    • Process Google Cloud Storage objects in SIEM.

Correct Answer: C
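
As a sketch of option C (a sink in each development project that exports into a topic owned by a dedicated SIEM project), with all project, topic, and sink names hypothetical:

  # Repeat in each development project: route its logs to the SIEM project's topic
  gcloud logging sinks create dev-to-siem \
      pubsub.googleapis.com/projects/siem-project/topics/siem-logs \
      --project=dev-project-1
  # Grant the sink's writer identity (printed by the command above) roles/pubsub.publisher
  # on the topic, then point the SIEM's subscription at projects/siem-project/topics/siem-logs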


Question 9

A customer’s data science group wants to use Google Cloud Platform (GCP) for their analytics workloads.
Company policy dictates that all data must be company-owned and all user authentications must go through their own Security Assertion Markup Language (SAML) 2.0 Identity Provider (IdP). The Infrastructure Operations Systems Engineer was trying to set up Google Cloud Identity for the customer and realized that their domain was already being used by G Suite.
How should you best advise the Systems Engineer to proceed with the least disruption?

  • A. Contact Google Support and initiate the Domain Contestation Process to use the domain name in your new Google Cloud Identity domain.
  • B. Register a new domain name, and use that for the new Google Cloud Identity domain.
  • C. Ask Google to provision the data science manager’s account as a Super Administrator in the existing domain.
  • D. Ask customer’s management to discover any other uses of Google managed services, and work with the existing Super Administrator.

Correct Answer: D


Question 10

An application running on a Google Compute Engine instance needs to read data from a Google Cloud Storage bucket.
Your team does not allow Google Cloud Storage buckets to be globally readable and wants to ensure the principle of least privilege.
Which option meets the requirements of your team?

  • A. Create a Google Cloud Storage ACL that allows read-only access from the Google Compute Engine instance’s IP address and allows the application to read from the bucket without credentials.
  • B. Use a service account with read-only access to the Google Cloud Storage bucket, and store the credentials to the service account in the config of the application on the Google Compute Engine instance.
  • C. Use a service account with read-only access to the Google Cloud Storage bucket to retrieve the credentials from the instance metadata.
  • D. Encrypt the data in the Google Cloud Storage bucket using Google Cloud KMS, and allow the application to decrypt the data with the KMS key.

Correct Answer: C
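
With option C, the application never stores a key file; it obtains short-lived credentials for the attached service account from the metadata server. A minimal sketch (the bucket and object names are placeholders):

  # Fetch an access token for the instance's attached service account
  curl -s -H "Metadata-Flavor: Google" \
      "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token"
  # Client libraries and gsutil do this automatically, for example:
  gsutil cp gs://example-data-bucket/input.csv /tmp/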


Question 11

An organization’s typical network and security review consists of analyzing application transit routes, request handling, and firewall rules.
They want to enable their developer teams to deploy new applications without the overhead of this full review.
How should you advise this organization?

  • A. Use Forseti with Firewall filters to catch any unwanted configurations in production.
  • B. Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies.
  • C. Route all VPC traffic through customer-managed routers to detect malicious patterns in production.
  • D. All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

Correct Answer: B


Question 12

An employer wants to track how bonus compensations have changed over time to identify employee outliers and correct earning disparities.
This task must be performed without exposing the sensitive compensation data for any individual and must be reversible to identify the outlier.
Which Google Cloud Data Loss Prevention API technique should you use to accomplish this?

  • A. Generalization
  • B. Redaction
  • C. CryptoHashConfig
  • D. CryptoReplaceFfxFpeConfig

Correct Answer: D


Question 13

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

  • A. Send all logs to the SIEM system via an existing protocol such as syslog.
  • B. Configure every project to export all their logs to a common Google BigQuery DataSet, which will be queried by the SIEM system.
  • C. Configure Organizational Log Sinks to export logs to a Google Cloud Pub/Sub Topic, which will be sent to the SIEM via Google Cloud Dataflow.
  • D. Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.

Correct Answer: C
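
A sketch of the aggregated export in option C, where the organization ID, project, and topic names are placeholders; Dataflow then reads the subscription and forwards the entries to the SIEM:

  gcloud logging sinks create org-logs-to-siem \
      pubsub.googleapis.com/projects/siem-project/topics/org-logs \
      --organization=123456789012 --include-children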


Question 14

In order to meet PCI DSS requirements, a customer wants to ensure that all outbound traffic is authorized.
Which two cloud offerings meet this requirement without additional compensating controls? (Choose two.)

  • A. Google App Engine
  • B. Google Cloud Functions
  • C. Google Compute Engine
  • D. Google Kubernetes Engine
  • E. Google Cloud Storage

Correct Answer: C, D

Reference contents:
PCI Data Security Standard compliance | Cloud Architecture Center | Google Cloud


Question 15

A website design company recently migrated all customer sites to Google App Engine.
Some sites are still in progress and should only be visible to customers and company employees from any location.
Which solution will restrict access to the in-progress sites?

  • A. Upload an .htaccess file containing the customer and employee user accounts to Google App Engine.
  • B. Create a Google App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic.
  • C. Enable Google Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts.
  • D. Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company’s GCP Virtual Private Cloud (VPC) network.

Correct Answer: C


Question 16

A company’s application is deployed with a user-managed Service Account key.
You want to use Google-recommended practices to rotate the key.
What should you do?

  • A. Open Google Cloud Shell and run gcloud iam service-accounts enable-auto-rotate --iam-account=IAM_ACCOUNT.
  • B. Open Google Cloud Shell and run gcloud iam service-accounts keys rotate --iam-account=IAM_ACCOUNT --key=NEW_KEY.
  • C. Create a new key, and use the new key in the application. Delete the old key from the Service Account.
  • D. Create a new key, and use the new key in the application. Store the old key on the system as a backup key.

Correct Answer: C

Reference contents:
Understanding service accounts | Cloud IAM Documentation | Google Cloud
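
The gcloud commands in options A and B do not exist; rotation is create-new, switch, then delete-old. A sketch with a hypothetical service account email and key ID:

  # Create a new key and deploy it to the application
  gcloud iam service-accounts keys create new-key.json \
      --iam-account=app-sa@my-project.iam.gserviceaccount.com
  # Once the application uses the new key, find and delete the old one
  gcloud iam service-accounts keys list \
      --iam-account=app-sa@my-project.iam.gserviceaccount.com
  gcloud iam service-accounts keys delete OLD_KEY_ID \
      --iam-account=app-sa@my-project.iam.gserviceaccount.com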


Question 17

An organization is migrating from their current on-premises productivity software systems to G Suite.
Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization’s risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud.
What solution would help meet the requirements?

  • A. Ensure that firewall rules are in place to meet the required controls.
  • B. Set up Google Cloud Armor to ensure that network security controls can be managed for G Suite.
  • C. Network security is a built-in solution and Google Cloud’s responsibility for SaaS products like G Suite.
  • D. Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.

Correct Answer: C


Question 18

A customer’s company has multiple business units.
Each business unit operates independently, and each has their own engineering group. Your team wants visibility into all projects created within the company and wants to organize their Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions.
Which strategy should you use to meet these needs?

  • A. Create an organization node, and assign folders for each business unit.
  • B. Establish standalone projects for each business unit, using gmail.com accounts.
  • C. Assign GCP resources in a project, with a label identifying which business unit owns the resource.
  • D. Assign GCP resources in a VPC for each business unit to separate network access.

Correct Answer: A


Question 19

Your team sets up a Shared VPC Network where project co-vpc-prod is the host project.
Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Google Compute Engine instance to only the 10.1.1.0/24 subnet.
What should your team grant to Engineering Group A to meet this requirement?

  • A. Compute Network User Role at the host project level.
  • B. Compute Network User Role at the subnet level.
  • C. Compute Shared VPC Admin Role at the host project level.
  • D. Compute Shared VPC Admin Role at the service project level.

Correct Answer: B

Reference contents:
Shared VPC overview | Google Cloud
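
A sketch of the subnet-level grant in option B, run against the host project; the subnet, region, and group names are placeholders:

  gcloud compute networks subnets add-iam-policy-binding subnet-10-1-1-0 \
      --project=co-vpc-prod --region=us-central1 \
      --member=group:eng-group-a@example.com \
      --role=roles/compute.networkUser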


Question 20

A company migrated their entire data center to Google Cloud Platform.
It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time.
What should you do?

  • A. Use Resource Manager on the organization level.
  • B. Use Forseti Security to automate inventory snapshots.
  • C. Use Stackdriver to create a dashboard across all projects.
  • D. Use the Security Command Center to view all assets across the organization.

Correct Answer: B


Question 21

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP).
The first step the organization wants to take is to migrate its current data backup and disaster recovery solutions to GCP for later analysis. The organization’s production environment will remain on-premises for an indefinite time. The organization wants a scalable and cost-efficient solution.
Which GCP solution should the organization use?

  • A. Google BigQuery using a data pipeline job with continuous updates.
  • B. Google Cloud Storage using a scheduled task and gsutil.
  • C. Google Compute Engine Virtual Machines using Persistent Disk.
  • D. Google Cloud Datastore using regularly scheduled batch upload jobs.

Correct Answer: B


Question 22

You are creating an internal Google App Engine application that needs to access a user’s Google Drive on the user’s behalf.
Your company does not want to rely on the current user’s credentials. It also wants to follow Google-recommended practices.
What should you do?

  • A. Create a new Service account, and give all application users the role of Service Account User.
  • B. Create a new Service account, and add all application users to a Google Group. Give this group the role of Service Account User.
  • C. Use a dedicated G Suite Admin account, and authenticate the application’s operations with these G Suite credentials.
  • D. Create a new service account, and grant it G Suite domain-wide delegation. Have the application use it to impersonate the user.

Correct Answer: D


Question 23

A customer wants to move their sensitive workloads to a Google Compute Engine-based cluster using Managed Instance Groups (MIGs).
The jobs are bursty and must be completed quickly. They have a requirement to be able to control the key lifecycle.
Which boot disk encryption solution should you use on the cluster to meet this customer’s requirements?

  • A. Customer-supplied encryption keys (CSEK).
  • B. Customer-managed encryption keys (CMEK) using Google Cloud Key Management Service (KMS).
  • C. Encryption by default.
  • D. Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis.

Correct Answer: B

Reference contents:
Using customer-managed encryption keys (CMEK) | Google Cloud
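
A sketch of option B for the MIG’s instance template, assuming the customer-managed key already exists in Cloud KMS (all names below are placeholders):

  gcloud compute instance-templates create burst-workers \
      --image-family=debian-11 --image-project=debian-cloud \
      --boot-disk-kms-key=projects/my-proj/locations/us-central1/keyRings/mig-kr/cryptoKeys/boot-key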


Question 24

You are a member of the security team at an organization.
Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.
What should you do?

  • A. Use multi-factor authentication for admin access to the web application.
  • B. Use only applications certified compliant with PA-DSS.
  • C. Move the cardholder data environment into a separate GCP project.
  • D. Use VPN for all connections between your office and cloud environments.

Correct Answer: C

Reference contents:
PCI Data Security Standard compliance | Cloud Architecture Center | Google Cloud


Question 25

A customer’s internal security team must manage its own encryption keys for encrypting data on Google Cloud Storage and decides to use customer-supplied encryption keys (CSEK).
How should the team complete this task?

  • A. Upload the encryption key to a Google Cloud Storage bucket, and then upload the object to the same bucket.
  • B. Use the gsutil command line tool to upload the object to Google Cloud Storage, and specify the location of the encryption key.
  • C. Generate an encryption key in the Google Cloud Platform Console, and upload an object to Google Cloud Storage using the specified key.
  • D. Encrypt the object, then use the gsutil command line tool or the Google Cloud Platform Console to upload the object to Google Cloud Storage.

Correct Answer: B

Reference contents:
Customer-supplied encryption keys | Cloud Storage | Google Cloud
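
With option B, the team supplies its own AES-256 key to gsutil (via the .boto configuration file or a -o override) and Cloud Storage uses it to encrypt the object server-side; the base64 key and bucket name below are placeholders:

  gsutil -o "GSUtil:encryption_key=YOUR_BASE64_ENCODED_32_BYTE_KEY" \
      cp confidential.csv gs://team-secure-bucket/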


Question 26

A customer has 300 engineers.
The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.
Which two steps should the company take to meet these requirements? (Choose two.)

  • A. Create a project with multiple VPC networks for each environment.
  • B. Create a folder for each development and production environment.
  • C. Create a Google Group for the Engineering team, and assign permissions at the folder level.
  • D. Create an Organizational Policy constraint for each folder environment.
  • E. Create projects for each environment, and grant IAM rights to each engineering user.

Correct Answer: B, C


Question 27

A DevOps team will create a new container to run on Google Kubernetes Engine.
As the application will be internet-facing, they want to minimize the attack surface of the container.
What should they do?

  • A. Use Google Cloud Build to build the container images.
  • B. Build small containers using small base images.
  • C. Delete non-used versions from Container Registry.
  • D. Use a Continuous Delivery tool to deploy the application.

Correct Answer: B

Reference contents:
Best practices for building containers | Cloud Architecture Center | Google Cloud


Question 28

While migrating your organization’s infrastructure to GCP, a large number of users will need to access GCP Console.
The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password.
What should you do?

  • A. Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.
  • B. Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.
  • C. Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.
  • D. Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Correct Answer: B

Reference contents:
Using your existing identity management system with Google Cloud Platform | Google Cloud


Question 29

Your company is using G Suite and has developed an application meant for internal use on Google App Engine.
You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.
What should you do?

  • A. Enforce 2-factor authentication in G Suite for all users.
  • B. Configure Google Cloud Identity-Aware Proxy for the Google App Engine Application.
  • C. Provision user passwords using G Suite Password Sync.
  • D. Configure Cloud VPN between your private network and GCP.

Correct Answer: A


Question 30

A large financial institution is moving its Big Data analytics to Google Cloud Platform.
They want to have maximum control over the encryption process of data stored at rest in Google BigQuery.
What technique should the institution use?

  • A. Use Google Cloud Storage as a federated Data Source.
  • B. Use a Google Cloud Hardware Security Module (Cloud HSM).
  • C. Customer-managed encryption keys (CMEK).
  • D. Customer-supplied encryption keys (CSEK).

Correct Answer: C

Reference contents:
Encryption at rest | BigQuery | Google Cloud


Question 31

Applications often require access to “secrets” – small pieces of sensitive data at build or run time.
The administrator managing these secrets on GCP wants to keep track of “who did what, where, and when?” within their GCP projects.
Which two log streams would provide the information that the administrator is looking for? (Choose two.)

  • A. Admin Activity logs
  • B. System Event logs
  • C. Data Access logs
  • D. VPC Flow logs
  • E. Agent logs

Correct Answer: A, C

Reference contents:
Secret Manager conceptual overview | Secret Manager Documentation | Google Cloud
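
Admin Activity logs capture who changed secret or key configuration, and Data Access logs capture who read the data itself; a sketch of pulling both audit streams (the project ID is a placeholder):

  gcloud logging read \
      'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"' --limit=20
  gcloud logging read \
      'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access"' --limit=20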


Question 32

You are in charge of migrating a legacy application from your company data centers to GCP before the current maintenance contract expires.
You do not know what ports the application is using and no documentation is available for you to check. You want to complete the migration without putting your environment at risk.
What should you do?

  • A. Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.
  • B. Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.
  • C. Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.
  • D. Refactor the application into a micro-services architecture hosted in Google Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

Correct Answer: A


Question 33

You want to limit the images that can be used as the source for boot disks.
These images will be stored in a dedicated project.
What should you do?

  • A. Use the Organization Policy Service to create a compute.trustedimageProjects constraint on the organization level. List the trusted project as the whitelist in an allowed operation.
  • B. Use the Organization Policy Service to create a compute.trustedimageProjects constraint on the organization level. List the trusted projects as the exceptions in a deny operation.
  • C. In Resource Manager, edit the project permissions for the trusted project. Add the organization as a member with the role: Compute Image User.
  • D. In Resource Manager, edit the organization permissions. Add the project ID as a member with the role: Compute Image User.

Correct Answer: A

Reference contents:
Setting up trusted image policies | Compute Engine Documentation | Google Cloud
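
One possible form of the allow-list in option A; the dedicated image project and organization ID shown are placeholders:

  gcloud resource-manager org-policies allow compute.trustedImageProjects \
      projects/trusted-images-project --organization=123456789012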


Question 34

Your team needs to prevent users from creating projects in the organization.
Only the DevOps team should be allowed to create projects on behalf of the requester.
Which two tasks should your team perform to handle this request? (Choose two.)

  • A. Remove all users from the Project Creator role at the organizational level.
  • B. Create an Organization Policy constraint, and apply it at the organizational level.
  • C. Grant the Project Editor role at the organizational level to a designated group of users.
  • D. Add a designated group of users to the Project Creator role at the organizational level.
  • E. Grant the billing account creator role to the designated DevOps team.

Correct Answer: A, D


Question 35

Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network.
How should your team design this network?

  • A. Create an ingress firewall rule to allow access only from the application to the database using firewall tags.
  • B. Create a different subnet for the frontend application and database to ensure network isolation.
  • C. Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.
  • D. Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.

Correct Answer: A
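
A sketch of option A using network tags on the frontend and database instances; the network, port, and tag names are placeholders:

  gcloud compute firewall-rules create allow-frontend-to-db \
      --network=prod-vpc --direction=INGRESS --allow=tcp:5432 \
      --source-tags=frontend --target-tags=database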


Question 36

Your team wants to make sure Google Compute Engine instances running in your production project do not have public IP addresses.
The frontend application Google Compute Engine instances will require public IPs. The product engineers have the Editor role to modify resources. Your team wants to enforce this requirement.
How should your team meet these requirements?

  • A. Enable Private Access on the VPC network in the production project.
  • B. Remove the Editor role and grant the Compute Admin IAM role to the engineers.
  • C. Set up an organization policy to only permit public IPs for the front-end Google Compute Engine instances.
  • D. Set up a VPC network with two subnets: one with public IPs and one without public IPs.

Correct Answer: C

Reference contents:
Reserving a static external IP address | Compute Engine Documentation | Google Cloud
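
One possible way to express option C with the compute.vmExternalIpAccess list constraint, allowing external IPs only for the named frontend instances (project, zone, and instance names are placeholders):

  gcloud resource-manager org-policies allow compute.vmExternalIpAccess \
      projects/prod-project/zones/us-central1-a/instances/frontend-1 \
      projects/prod-project/zones/us-central1-a/instances/frontend-2 \
      --project=prod-project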


Question 37

Which two security characteristics are related to the use of VPC peering to connect two VPC networks? (Choose two.)

  • A. Central management of routes, firewalls, and VPNs for peered networks.
  • B. Non-transitive peered networks; where only directly peered networks can communicate.
  • C. Ability to peer networks that belong to different Google Cloud Platform organizations.
  • D. Firewall rules that can be created with a tag from one peered network to another peered network.
  • E. Ability to share specific subnets across peered networks.

Correct Answer: B, C


Question 38

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).
How should the DevOps team accomplish this?

  • A. Use Puppet or Chef to push out the patch to the running container.
  • B. Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.
  • C. Update the application code or apply a patch, build a new image, and redeploy it.
  • D. Configure containers to automatically upgrade when the base image is available in Container Registry.

Correct Answer: C

Reference contents:
Security bulletins | Anthos clusters | Google Cloud


Question 39

A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in Google BigQuery.
You need to ensure that no credit card numbers are stored in Google BigQuery.
What should you do?

  • A. Create a Google BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.
  • B. Use the Google Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into Google BigQuery.
  • C. Leverage Security Command Center to scan for the assets of type Credit Card Number in Google BigQuery.
  • D. Enable Google Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in Google BigQuery.

Correct Answer: B


Question 40

A manager wants to start retaining security event logs for 2 years while minimizing costs.
You write a filter to select the appropriate log entries.
Where should you export the logs?

  • A. Google BigQuery datasets
  • B. Google Cloud Storage buckets
  • C. StackDriver logging
  • D. Google Cloud Pub/Sub topics

Correct Answer: B

Reference contents:
Logs exclusions | Cloud Logging | Google Cloud


Question 41

In an effort for your company messaging app to comply with FIPS 140-2, a decision was made to use GCP compute and network services.
The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Google Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.
Which options should you recommend to meet the requirements?

  • A. Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.
  • B. Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.
  • C. Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients’ TLS connections.
  • D. Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use the BoringSSL library on all instance-to-instance communications.

Correct Answer: D


Question 42

A company is backing up application logs to a Google Cloud Storage bucket shared with both analysts and the administrator.
Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.
What should you do?

  • A. Use Google Cloud Pub/Sub and Google Cloud Functions to trigger a Google Cloud Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move into a Google Cloud Storage bucket only accessible by the administrator.
  • B. Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Google Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.
  • C. On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.
  • D. On the bucket shared with both the analysts and the administrator, configure a Google Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Google Cloud Functions to capture the trigger and delete such files.

Correct Answer: A


Question 43

A customer terminates an engineer and needs to make sure the engineer’s Google account is automatically deprovisioned.
What should the customer do?

  • A. Use the Google Cloud SDK with their directory service to remove their IAM permissions in Google Cloud Identity.
  • B. Use the Google Cloud SDK with their directory service to provision and deprovision users from Google Cloud Identity.
  • C. Configure Google Cloud Directory Sync with their directory service to provision and deprovision users from Google Cloud Identity.
  • D. Configure Google Cloud Directory Sync with their directory service to remove their IAM permissions in Google Cloud Identity.

Correct Answer: C


Question 44

An organization is evaluating the use of Google Cloud Platform (GCP) for certain IT workloads.
A well-established directory service is used to manage user identities and lifecycle management. This directory service must continue for the organization to use as the “source of truth” directory for identities.
Which solution meets the organization’s requirements?

  • A. Google Cloud Directory Sync (GCDS)
  • B. Google Cloud Identity
  • C. Security Assertion Markup Language (SAML)
  • D. Google Cloud Pub/Sub

Correct Answer: A

Reference contents:
Federating Google Cloud with Active Directory | Google Cloud


Question 45

Which international compliance standard provides guidelines for information security controls applicable to the provision and use of cloud services?

  • A. ISO 27001
  • B. ISO 27002
  • C. ISO 27017
  • D. ISO 27018

Correct Answer: C



Question 46

What are the steps to encrypt data using envelope encryption?

  • A.
    • Generate a data encryption key (DEK) locally.
    • Use a key encryption key (KEK) to wrap the DEK.
    • Encrypt data with the KEK.
    • Store the encrypted data and the wrapped KEK.
  • B.
    • Generate a key encryption key (KEK) locally.
    • Use the KEK to generate a data encryption key (DEK).
    • Encrypt data with the DEK.
    • Store the encrypted data and the wrapped DEK.
  • C.
    • Generate a data encryption key (DEK) locally.
    • Encrypt data with the DEK.
    • Use a key encryption key (KEK) to wrap the DEK.
    • Store the encrypted data and the wrapped DEK.
  • D.
    • Generate a key encryption key (KEK) locally.
    • Generate a data encryption key (DEK) locally.
    • Encrypt data with the KEK.
    • Store the encrypted data and the wrapped DEK.

Correct Answer: C

Reference contents:
Envelope encryption | Cloud KMS Documentation | Google Cloud
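
A minimal shell sketch of the correct flow, using openssl for the local DEK and Cloud KMS as the KEK; the key ring and key names are placeholders:

  # 1. Generate a data encryption key (DEK) locally
  openssl rand -out dek.bin 32
  # 2. Encrypt the data with the DEK
  openssl enc -aes-256-cbc -pbkdf2 -pass file:dek.bin -in data.txt -out data.enc
  # 3. Wrap the DEK with the key encryption key (KEK) held in Cloud KMS
  gcloud kms encrypt --location=global --keyring=app-kr --key=kek \
      --plaintext-file=dek.bin --ciphertext-file=dek.wrapped
  # 4. Store data.enc and dek.wrapped together; discard the plaintext DEK
  shred -u dek.bin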


Question 47

An organization’s security and risk management teams are concerned about where their responsibility lies for certain production workloads they are running in Google Cloud Platform (GCP), and where Google’s responsibility lies.
They are mostly running workloads using Google Cloud’s Platform-as-a-Service (PaaS) offerings, primarily Google App Engine.
Which one of these areas in the technology stack would they need to focus on as their primary responsibility when using Google App Engine?

  • A. Configuring and monitoring VPC Flow Logs.
  • B. Defending against XSS and SQLi attacks.
  • C. Managing the latest updates and security patches for the Guest OS.
  • D. Encrypting all stored data.

Correct Answer: B


Question 48

An engineering team is launching a web application that will be public on the internet.
The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request. Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses.
Which solution should your team implement to meet these requirements?

  • A. Google Cloud Armor
  • B. Network Load Balancing
  • C. SSL Proxy Load Balancing
  • D. NAT Gateway

Correct Answer: A

Reference contents:
Security policy overview | Google Cloud
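
A sketch of a Cloud Armor policy attached to the global HTTP(S) load balancer’s backend service; the policy name, backend service, and IP ranges are placeholders:

  gcloud compute security-policies create edge-policy
  gcloud compute security-policies rules create 1000 \
      --security-policy=edge-policy --action=deny-403 \
      --src-ip-ranges=198.51.100.0/24,203.0.113.7
  gcloud compute backend-services update web-backend \
      --security-policy=edge-policy --global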


Question 49

A customer is running an analytics workload on Google Cloud Platform (GCP) where Google Compute Engine instances are accessing data stored on Google Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet.
Which two strategies should your team use to meet these requirements? (Choose two.)

  • A. Configure Private Google Access on the Google Compute Engine subnet
  • B. Avoid assigning public IP addresses to the Google Compute Engine cluster.
  • C. Make sure that the Google Compute Engine cluster is running on a separate subnet.
  • D. Turn off IP forwarding on the Google Compute Engine instances in the cluster.
  • E. Configure a Google Cloud NAT gateway.

Correct Answer: A, B
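
A sketch of the two controls: no external IPs on the instances, and Private Google Access on the subnet so they can still reach Cloud Storage (subnet, region, and instance names are placeholders):

  # Let instances without external IPs reach Google APIs such as Cloud Storage
  gcloud compute networks subnets update analytics-subnet \
      --region=us-central1 --enable-private-ip-google-access
  # Create the workload instances with no external IP
  gcloud compute instances create worker-1 \
      --zone=us-central1-a --subnet=analytics-subnet --no-address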


Question 50

You are on your company’s development team.
You noticed that your web application hosted in staging on GKE dynamically includes user data in web pages without first properly validating the inputted data. This could allow an attacker to execute malicious scripts and display arbitrary content in a victim user’s browser in a production environment.
How should you prevent and fix this vulnerability?

  • A. Use Google Cloud IAP based on IP address or end-user device attributes to prevent and fix the vulnerability.
  • B. Set up an HTTPS load balancer, and then use Google Cloud Armor for the production environment to prevent the potential XSS attack.
  • C. Use Web Security Scanner to validate the usage of an outdated library in the code, and then use a secured version of the included library.
  • D. Use Web Security Scanner in staging to simulate an XSS injection attack, and then use a templating system that supports contextual auto-escaping.

Correct Answer: D

Reference contents:
Security Command Center documentation | Google Cloud


Question 51

You are responsible for protecting highly sensitive data in Google BigQuery.
Your operations teams need access to this data, but given privacy regulations, you want to ensure that they cannot read the sensitive fields such as email addresses and first names. These specific sensitive fields should only be available on a need-to-know basis to the HR team.
What should you do?

  • A. Perform data masking with the Google Cloud DLP API and store that data in Google BigQuery for later use.
  • B. Perform data redaction with the Google Cloud DLP API and store that data in Google BigQuery for later use.
  • C. Perform data inspection with the Google Cloud DLP API and store that data in Google BigQuery for later use.
  • D. Perform tokenization for Pseudonymization with the Google Cloud DLP API and store that data in Google BigQuery for later use.

Correct Answer: D

Reference contents:
BigQuery, PII, and Cloud Data Loss Prevention (DLP): Take it to the next level with Data Catalog
