[GCP] Google Cloud Certified - Associate Cloud Engineer

Google Cloud Certified – Associate Cloud Engineer – Practice Exam (Questions 1–18)


Question 1

You have successfully created a development environment in a project for an application.
This application uses Google Compute Engine and Google Cloud SQL. Now, you need to create a production environment for this application. The security team has forbidden the existence of network routes between these 2 environments, and asks you to follow Google-recommended practices.
What should you do?

  • A. Create a new project, enable the Google Compute Engine and Google Cloud SQL APIs in that project, and replicate the setup you have created in the development environment.
  • B. Create a new project, modify your existing VPC to be a Shared VPC, share that VPC with your new project, and replicate the setup you have in the development environment in that new project, in the Shared VPC.
  • C. Ask the security team to grant you the Project Editor role in an existing production project used by another division of your company. Once they grant you that role, replicate the setup you have in the development environment in that project.
  • D. Create a new production subnet in the existing VPC and a new production Google Cloud SQL instance in your existing project, and deploy your application using those resources.

Correct Answer: A
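Separate projects give the network isolation the security team requires, because VPC networks in different projects have no routes between them by default. A sketch of option A from the command line (the project ID is a placeholder; `sqladmin.googleapis.com` is the Cloud SQL Admin API):

```shell
# Create a dedicated production project (ID is illustrative).
gcloud projects create prod-app-env --name="Production"

# Enable the Compute Engine and Cloud SQL Admin APIs in the new project.
gcloud services enable compute.googleapis.com sqladmin.googleapis.com \
    --project=prod-app-env
```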


Question 2

You are the team lead of a group of 10 developers.
You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment.
What should you do?

  • A. Create a single billing account for all sandbox projects and enable Google BigQuery billing exports. Create a Google Data Studio dashboard to plot the spending per project.
  • B. Create a separate billing account per sandbox project and enable Google BigQuery billing exports. Create a Google Data Studio dashboard to plot the spending per billing account.
  • C. Create a budget per project and configure budget alerts on all of these budgets.
  • D. Create a single budget for all projects and configure budget alerts on this budget.

Correct Answer: C

Reference contents:
Set budgets and budget alerts | Cloud Billing
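The per-project budgets in option C can also be scripted with `gcloud billing budgets create`. A hedged sketch, repeated once per sandbox project (the billing account ID and project ID are placeholders):

```shell
# One budget per sandbox project, alerting as spend approaches $500/month.
gcloud billing budgets create \
    --billing-account=0X0X0X-0X0X0X-0X0X0X \
    --display-name="dev1 sandbox budget" \
    --budget-amount=500USD \
    --filter-projects=projects/dev1-sandbox \
    --threshold-rule=percent=0.9 \
    --threshold-rule=percent=1.0
```

By default, budget alert emails go to the Billing Account Administrators and Billing Account Users on the billing account.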


Question 3

You need to verify the assigned permissions in a custom IAM role.
What should you do?

  • A. Use the Google Cloud Console, IAM section to view the information.
  • B. Use the Google Cloud Console, API section to view the information.
  • C. Use the Google Cloud Console, Security section to view the information.
  • D. Use the “gcloud init” command to view the information.

Correct Answer: A

A is correct because the IAM section of the console is where the permissions assigned to a custom role in a particular project are displayed.
B and C are not correct because these are not the correct console areas to view this information. D is not correct because ‘gcloud init’ initializes the SDK configuration and will not provide the information required.
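The same information is also available from the CLI: `gcloud iam roles describe` prints a custom role's `includedPermissions` (the role ID and project ID below are placeholders):

```shell
# Show the permissions bundled into a project-level custom role.
gcloud iam roles describe myCustomRole --project=my-project
```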


Question 4

You are using Google Container Registry to centrally store your company’s container images in a separate project.
In another project, you want to create a Google Kubernetes Engine (GKE) cluster. You want to ensure that Kubernetes can download images from Google Container Registry.
What should you do?

  • A. Create a service account, and give it access to Google Cloud Storage. Create a P12 key for this service account and use it as an imagePullSecrets in Kubernetes.
  • B. When you create the GKE cluster, choose [the Allow full access to all Cloud APIs] option under [Access scopes].
  • C. In the project where the images are stored, grant the Storage Object Viewer IAM role to the service account used by the Kubernetes nodes.
  • D. Configure the ACLs on each image in Google Cloud Storage to give read-only access to the default Google Compute Engine service account.

Correct Answer: C

Reference contents:
Configuring access control | Container Registry documentation
Using Container Registry with Google Cloud Platform
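Container Registry stores images in a Cloud Storage bucket named `artifacts.[PROJECT-ID].appspot.com`, so the grant in option C can be applied to that bucket. A sketch (both project IDs and the node service account e-mail are placeholders):

```shell
# In the project hosting the images, let the GKE node service account
# read the underlying Container Registry bucket.
gsutil iam ch \
    serviceAccount:my-node-sa@gke-project.iam.gserviceaccount.com:objectViewer \
    gs://artifacts.images-project.appspot.com
```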


Question 5

You are hosting an application on bare-metal servers in your own data center.
The application needs access to Google Cloud Storage. However, security policies prevent the servers hosting the application from having public IP addresses or access to the internet. You want to follow Google-recommended practices to provide the application with access to Google Cloud Storage.
What should you do?

  • A.
    • 1. Use nslookup to get the IP address for storage.googleapis.com.
    • 2. Negotiate with the security team to be able to give a public IP address to the servers.
    • 3. Only allow egress traffic from those servers to the IP addresses for storage.googleapis.com.
  • B.
    • 1. Using Google Cloud VPN or Interconnect, create a tunnel to a VPC in GCP.
    • 2. Use Google Cloud Router to create a custom route advertisement for 199.36.153.4/30. Announce that network to your on-premises network through the VPN tunnel.
    • 3. In your on-premises network, configure your DNS server to resolve *.googleapis.com as a CNAME to restricted.googleapis.com.
  • C.
    • 1. Use Migrate for Compute Engine to migrate those servers to Google Compute Engine.
    • 2. Create an internal load balancer (ILB) that uses storage.googleapis.com as a backend.
    • 3. Configure your new instances to use this ILB as a proxy.
  • D.
    • 1. Using Google Cloud VPN, create a VPN tunnel to a VPC in GCP.
    • 2. In this VPC, create a Google Compute Engine instance and install the Squid proxy server on this instance.
    • 3. Configure your servers to use that instance as a proxy to access Google Cloud Storage.

Correct Answer: B 

Reference contents:
Configuring Private Google Access for on-premises hosts | VPC
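The custom route advertisement in step 2 of option B could look like this (the router, peer, and region names are placeholders; `199.36.153.4/30` is the documented range for `restricted.googleapis.com`):

```shell
# Advertise the restricted.googleapis.com range to on-premises over BGP.
gcloud compute routers update-bgp-peer my-router \
    --peer-name=on-prem-peer \
    --region=us-central1 \
    --advertisement-mode=CUSTOM \
    --set-advertisement-ranges=199.36.153.4/30
```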


Question 6

You have a Google Compute Engine instance hosting a production application.
You want to receive an email if the instance consumes more than 90% of its CPU resources for more than 15 minutes. You want to use Google services.
What should you do?

  • A.
    • 1. In Stackdriver Logging, create a logs-based metric to extract the CPU usage by using this regular expression: CPU Usage: ([0-9]{1,3})%
    • 2. In Stackdriver Monitoring, create an Alerting Policy based on this metric.
    • 3. Configure your email address in the notification channel.
  • B.
    • 1. Create a Stackdriver Workspace, and associate your GCP project with it. 
    • 2. Write a script that monitors the CPU usage and sends it as a custom metric to Stackdriver.
    • 3. Create an uptime check for the instance in Stackdriver.
  • C.
    • 1. Create a Stackdriver Workspace, and associate your GCP project with it.
    • 2. Create an Alerting Policy in Stackdriver that uses the threshold as a trigger condition.
    • 3. Configure your email address in the notification channel.
  • D.
    • 1. Create a consumer Gmail account. 
    • 2. Write a script that monitors the CPU usage.
    • 3. When the CPU usage exceeds the threshold, have that script send an email using the Gmail account and smtp.gmail.com on port 25 as SMTP server.

Correct Answer: C

C is correct: Monitoring already collects CPU utilization as a built-in metric, so an alerting policy with a 90% threshold over 15 minutes plus an email notification channel needs no custom instrumentation. A is not correct because CPU usage is not written to the instance's logs by default, so a logs-based metric would have nothing to extract.
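An alerting policy with a threshold condition, as described in option C, can also be created from a file. A hedged sketch (the notification channel ID and display names are placeholders, and the `gcloud monitoring policies` surface has historically lived in the alpha/beta release tracks):

```shell
# Minimal policy: alert when CPU utilization stays above 90% for 15 minutes.
cat > cpu-policy.json <<'EOF'
{
  "displayName": "CPU above 90% for 15 minutes",
  "combiner": "OR",
  "conditions": [{
    "displayName": "High CPU utilization",
    "conditionThreshold": {
      "filter": "metric.type=\"compute.googleapis.com/instance/cpu/utilization\" AND resource.type=\"gce_instance\"",
      "comparison": "COMPARISON_GT",
      "thresholdValue": 0.9,
      "duration": "900s"
    }
  }]
}
EOF

gcloud alpha monitoring policies create \
    --policy-from-file=cpu-policy.json \
    --notification-channels=CHANNEL_ID
```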


Question 7

You have a project using Google BigQuery.
You want to list all Google BigQuery jobs for that project. You want to set this project as the default for the bq command-line tool.
What should you do?

  • A. Use “gcloud config set project” to set the default project.
  • B. Use “gcloud generate config-url” to generate a URL to the Google Cloud Console to set the default project.
  • C. Use “bq generate config-url” to generate a URL to the Google Cloud Console to set the default project.
  • D. Use “bq config set project” to set the default project.

Correct Answer: A

A is correct because gcloud is used to manage the configuration defaults, which the bq command-line tool also honors.
B and C are not correct because entering these commands will not achieve the desired result and will generate an error.
D is not correct because the bq command-line tool inherits the gcloud configuration settings; the default project cannot be set through bq.

Reference contents:
Command-line tool reference | BigQuery
gcloud config set | Cloud SDK Documentation
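Putting A into practice (the project ID is a placeholder); `bq ls -j` then lists the jobs in the configured default project:

```shell
# Set the default project; bq honors the shared gcloud configuration.
gcloud config set project my-project

# List BigQuery jobs in the default project.
bq ls -j
```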


Question 8

You want to deploy an application on Google Cloud Run that processes messages from a Google Cloud Pub/Sub topic.
You want to follow Google-recommended practices.
What should you do?

  • A.
    • 1. Create a service account.
    • 2. Give the Google Cloud Run Invoker role to that service account for your Google Cloud Run application.
    • 3. Create a Google Cloud Pub/Sub subscription that uses that service account and uses your Google Cloud Run application as the push endpoint.
  • B.
    • 1. Grant the Google Cloud Pub/Sub Subscriber role to the service account used by Google Cloud Run.
    • 2. Create a Google Cloud Pub/Sub subscription for that topic.
    • 3. Make your application pull messages from that subscription.
  • C.
    • 1. Create a Google Cloud Function that uses a Google Cloud Pub/Sub trigger on that topic.
    • 2. Call your application on Google Cloud Run from the Google Cloud Function for every message.
  • D.
    • 1. Deploy your application on Google Cloud Run on GKE with the connectivity set to Internal.
    • 2. Create a Google Cloud Pub/Sub subscription for that topic.
    • 3. In the same GKE cluster as your application, deploy a container that takes the messages and sends them to your application.

Correct Answer: A
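The three steps in option A map to three commands; this mirrors the push-subscription pattern in Google's Cloud Run/Pub/Sub tutorial (the service, topic, region, project, and URL below are placeholders):

```shell
# 1. A dedicated identity for Pub/Sub to call Cloud Run with.
gcloud iam service-accounts create run-pubsub-invoker

# 2. Allow that identity to invoke the Cloud Run service.
gcloud run services add-iam-policy-binding my-app \
    --region=us-central1 \
    --member=serviceAccount:run-pubsub-invoker@my-project.iam.gserviceaccount.com \
    --role=roles/run.invoker

# 3. Push subscription that authenticates as the service account.
gcloud pubsub subscriptions create my-sub \
    --topic=my-topic \
    --push-endpoint=https://my-app-abc123-uc.a.run.app/ \
    --push-auth-service-account=run-pubsub-invoker@my-project.iam.gserviceaccount.com
```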


Question 9

You are building a new version of an application hosted in an Google App Engine environment.
You want to test the new version with 1% of users before you completely switch your application over to the new version.
What should you do?

  • A. Deploy a new version of your application in a Google Compute Engine instance instead of Google App Engine and then use Google Cloud Console to split traffic.
  • B. Deploy a new version as a separate app in Google App Engine. Then configure Google App Engine using Google Cloud Console to split traffic between the two apps.
  • C. Deploy a new version of your application in Google Kubernetes Engine instead of Google App Engine and then use Google Cloud Console to split traffic.
  • D. Deploy a new version of your application in Google App Engine. Then go to Google App Engine settings in Google Cloud Console and split traffic between the current version and newly deployed versions accordingly.

Correct Answer: D

Reference contents:
Splitting Traffic | App Engine standard environment for Python 2
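Option D from the command line (version names are placeholders): deploy the new version without promoting it, then split traffic between versions of the same service.

```shell
# Deploy the new version but keep serving the old one.
gcloud app deploy --version=v2 --no-promote

# Send 1% of traffic to v2, 99% to the current version v1.
gcloud app services set-traffic default \
    --splits=v1=0.99,v2=0.01 \
    --split-by=random
```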


Question 10

You are asked to set up application performance monitoring on GCP projects A, B, and C as a single pane of glass.
You want to monitor CPU, memory, and disk.
What should you do?

  • A. Enable API and then give the metrics.reader role to GCP projects A, B, and C.
  • B. Enable API and then share charts from GCP projects A and B.
  • C. Enable API, create a workspace under GCP project A, and then add projects B and C.
  • D. Enable API and then use default dashboards to view all projects in sequence.

Correct Answer: C

A Monitoring workspace hosted in one project that monitors the other two gives the single pane of glass; viewing default dashboards project by project (option D) does not.


Question 11

You want to verify the IAM users and roles assigned within a GCP project named my-project.
What should you do?

  • A. Run gcloud iam service-accounts list. Review the output section.
  • B. Navigate to the project and then to the Roles section in the Google Cloud Console. Review the roles and status.
  • C. Navigate to the project and then to the IAM section in the Google Cloud Console. Review the members and roles.
  • D. Run gcloud iam roles list. Review the output section.

Correct Answer: C

The IAM section provides the list of both members and roles.
Option A is wrong as it would list only the service accounts.
Option B is wrong as it would provide information about the roles only.
Option D is wrong as it would provide information about the roles only.
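The same member/role listing is available from the CLI via `gcloud projects get-iam-policy`; the `--flatten`/`--format` combination below is one common way to tabulate the bindings:

```shell
# List every member and the role bound to it in my-project.
gcloud projects get-iam-policy my-project \
    --flatten="bindings[].members" \
    --format="table(bindings.role, bindings.members)"
```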


Question 12

Your organization has strict requirements to control access to GCP projects.
You need to enable your Site Reliability Engineers (SREs) to approve requests from the Google Cloud support team when an SRE opens a support case. You want to follow Google-recommended practices.
What should you do?

  • A. Add your SREs to roles/iam.roleAdmin role.
  • B. Add your SREs to a group and then add this group to the roles/iam.roleAdmin role.
  • C. Add your SREs to a group and then add this group to the roles/accessapproval.approver role.
  • D. Add your SREs to the roles/accessapproval.approver role.

Correct Answer: C
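Granting the group the Access Approval approver role, per option C (the project ID and group address are placeholders):

```shell
# Let the SRE group approve Google support access requests.
gcloud projects add-iam-policy-binding my-project \
    --member=group:sre-team@example.com \
    --role=roles/accessapproval.approver
```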


Question 13

You have been asked to set up Object Lifecycle Management for objects stored in storage buckets.
The objects are written once and accessed frequently for 30 days. After 30 days, the objects are not read again unless there is a special need. The object should be kept for three years, and you need to minimize cost.
What should you do?

  • A. Set up a policy that uses Standard storage for 30 days and then moves to Archive storage for three years.
  • B. Set up a policy that uses Standard storage for 30 days, then moves to Coldline for one year, and then moves to Archive storage for two years.
  • C. Set up a policy that uses Nearline storage for 30 days, then moves to Coldline for one year, and then moves to Archive storage for two years.
  • D. Set up a policy that uses Nearline storage for 30 days and then moves to Archive storage for three years.

Correct Answer: A

The objects are read frequently during the first 30 days, so Standard storage (no retrieval fees, no minimum storage duration) is the low-cost choice for that period; Archive storage is then the cheapest class for the rarely read remainder of the three years.

Reference contents:
Object Lifecycle Management #Object lifecycle behavior | Cloud Storage
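The transition at day 30 and the three-year retention can be expressed as a lifecycle configuration (the bucket name is a placeholder; 1,095 days is roughly three years):

```shell
# Move objects to Archive after 30 days; delete them after three years.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 1095}
    }
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://my-bucket
```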


Question 14

Your company has a large quantity of unstructured data in different file formats.
You want to perform ETL transformations on the data. You need to make the data accessible on Google Cloud so it can be processed by a Google Cloud Dataflow job.
What should you do?

  • A. Upload the data to Google Cloud Storage using the gsutil command line tool.
  • B. Upload the data into Google Cloud Spanner using the import function in the console.
  • C. Upload the data into Google Cloud SQL using the import function in the console.
  • D. Upload the data to Google BigQuery using the bq command line tool.

Correct Answer: A

Reference contents:
Performing ETL from a relational database into BigQuery using Dataflow
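Uploading with gsutil, per option A; `-m` parallelizes the transfer and `-r` recurses into directories, which helps with large quantities of files (paths and bucket name are placeholders):

```shell
# Copy the local data set into Cloud Storage for Dataflow to read.
gsutil -m cp -r ./unstructured-data gs://my-etl-bucket/raw/
```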


Question 15

You need a dynamic way of provisioning VMs on Google Compute Engine.
The exact specifications will be in a dedicated configuration file. You want to follow Google’s recommended practices.
Which method should you use?

  • A. Managed Instance Group
  • B. Google Cloud Deployment Manager
  • C. Google Cloud Composer
  • D. Unmanaged Instance Group

Correct Answer: B

Google Cloud Deployment Manager allows you to specify all the resources needed for your application in a declarative format using YAML. You can also use Python or Jinja2 templates to parameterize the configuration and allow reuse of common deployment paradigms, such as a load-balanced, auto-scaled instance group. Treat your configuration as code and perform repeatable deployments.

Reference contents:
Google Cloud Deployment Manager
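A minimal Deployment Manager configuration and its deployment might look like this (the deployment and VM names, zone, machine type, and image family are all illustrative):

```shell
# A declarative config describing a single VM.
cat > vm.yaml <<'EOF'
resources:
- name: my-vm
  type: compute.v1.instance
  properties:
    zone: us-central1-a
    machineType: zones/us-central1-a/machineTypes/e2-medium
    disks:
    - deviceName: boot
      type: PERSISTENT
      boot: true
      autoDelete: true
      initializeParams:
        sourceImage: projects/debian-cloud/global/images/family/debian-11
    networkInterfaces:
    - network: global/networks/default
EOF

gcloud deployment-manager deployments create my-deployment --config=vm.yaml
```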


Question 16

You are hosting an application from Google Compute Engine virtual machines (VMs) in us-central1-a.
You want to adjust your design to support the failure of a single Google Compute Engine zone, eliminate downtime, and minimize cost.
What should you do?

  • A. Create an HTTP(S) Load Balancer. Create one or more global forwarding rules to direct traffic to your VMs.
  • B. Perform regular backups of your application. Create a Google Cloud Monitoring Alert and be notified if your application becomes unavailable. Restore from backups when notified.
  • C. Create a Managed Instance Group and specify us-central1-a as the zone. Configure the Health Check with a short Health Interval.
  • D. Create Google Compute Engine resources in us-central1-b. Balance the load across both us-central1-a and us-central1-b.

Correct Answer: D

A Managed Instance Group scoped only to us-central1-a (option C) still goes down with that zone; running and load-balancing VMs across both us-central1-a and us-central1-b survives a zonal failure without downtime.

Reference contents:
GoogleCloudPlatform/puppet-google-compute


Question 17

You have been asked to automate the infrastructure deployment using Google Cloud Deployment Manager service.
Which formats does a Google Cloud Deployment Manager template support?

  • A. Python
  • B. YAML
  • C. Powershell
  • D. JSON

Correct Answer: A, B

Reference contents:
Creating a Basic Template #Template syntax | Cloud Deployment Manager Documentation


Question 18

You want to configure an SSH connection to a single Google Compute Engine instance for users in the dev1 group.
This instance is the only resource in this particular GCP project that the dev1 users should be able to connect to.
What should you do?

  • A. Enable block project-wide keys for the instance. Generate an SSH key for each user in the dev1 group. Distribute the keys to dev1 users and direct them to use their third-party tools to connect.
  • B. Set metadata to enable-oslogin=true for the instance. Set the service account to no service account for that instance. Direct them to use the Google Cloud Shell to ssh to that instance.
  • C. Set metadata to enable-oslogin=true for the instance. Grant the dev1 group the compute.osLogin role. Direct them to use the Google Cloud Shell to ssh to that instance.
  • D. Enable block project-wide keys for the instance. Generate an SSH key and associate the key with that instance. Distribute the key to dev1 users and direct them to use their third-party tools to connect.

Correct Answer: C
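Option C in commands (the instance name, zone, and group address are placeholders); granting `roles/compute.osLogin` on the instance rather than the project scopes access to that single VM:

```shell
# Turn on OS Login for just this instance.
gcloud compute instances add-metadata dev-instance \
    --zone=us-central1-a \
    --metadata=enable-oslogin=TRUE

# Allow the dev1 group to log in to this instance only.
gcloud compute instances add-iam-policy-binding dev-instance \
    --zone=us-central1-a \
    --member=group:dev1@example.com \
    --role=roles/compute.osLogin
```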
