
DP 300 Demo

The document provides a detailed overview of the DP-300 exam for Azure Database Administrator Associate, including the existing and planned database environments of a fictional company, Litware. It outlines technical, security, compliance, and business requirements for database management and migration, as well as specific questions and answers related to the exam content. The document serves as a demo version with limited content and encourages users to obtain the full exam file through a provided link.


Microsoft

DP-300 Exam
Azure Database Administrator Associate

Questions & Answers


(Demo Version - Limited Content)

Thank you for downloading the DP-300 exam PDF demo.

Get Full File:

https://authorizedumps.com/dp-300-exam-dumps/

www.authorizedumps.com
Questions & Answers PDF Page 2

Version: 24.4

Topic 1, Litware

Existing Environment

Network Environment

The manufacturing and research datacenters connect to the primary datacenter by using a VPN.

The primary datacenter has an ExpressRoute connection that uses both Microsoft peering and
private peering. The private peering connects to an Azure virtual network named HubVNet.

Identity Environment

Litware has a hybrid Azure Active Directory (Azure AD) deployment that uses a domain named
litwareinc.com. All Azure subscriptions are associated to the litwareinc.com Azure AD tenant.

Database Environment

The sales department has the following database workload:


An on-premises server named SERVER1 hosts an instance of Microsoft SQL Server 2012 and two 1-TB
databases.

A logical server named SalesSrv01A contains a geo-replicated Azure SQL database named
SalesSQLDb1. SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool. SalesSQLDb1 uses database
firewall rules and contained database users.

An application named SalesSQLDb1App1 uses SalesSQLDb1.

The manufacturing office contains two on-premises SQL Server 2016 servers named SERVER2 and
SERVER3. The servers are nodes in the same Always On availability group. The availability group
contains a database named ManufacturingSQLDb1.

Database administrators have two Azure virtual machines in HubVNet named VM1 and VM2 that run
Windows Server 2019 and are used to manage all the Azure databases.

Licensing Agreement

Litware is a Microsoft Volume Licensing customer that has License Mobility through Software
Assurance.

Current Problems

SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and
frequent blocking queries.

Requirements

Planned Changes


Litware plans to implement the following changes:

Implement 30 new databases in Azure, which will be used by time-sensitive manufacturing apps that
have varying usage patterns. Each database will be approximately 20 GB.

Create a new Azure SQL database named ResearchDB1 on a logical server named ResearchSrv01.
ResearchDB1 will contain Personally Identifiable Information (PII) data.

Develop an app named ResearchApp1 that will be used by the research department to populate and
access ResearchDB1.

Migrate ManufacturingSQLDb1 to the Azure virtual machine platform.

Migrate the SERVER1 databases to the Azure SQL Database platform.

Technical Requirements

Litware identifies the following technical requirements:

Maintenance tasks must be automated.

The 30 new databases must scale automatically.

The use of an on-premises infrastructure must be minimized.

Azure Hybrid Use Benefits must be leveraged for Azure SQL Database deployments.

All SQL Server and Azure SQL Database metrics related to CPU and storage usage and limits must be
analyzed by using Azure built-in functionality.

Security and Compliance Requirements

Litware identifies the following security and compliance requirements:

Store encryption keys in Azure Key Vault.

Retain backups of the PII data for two months.


Encrypt the PII data at rest, in transit, and in use.

Use the principle of least privilege whenever possible.

Authenticate database users by using Active Directory credentials.

Protect Azure SQL Database instances by using database-level firewall rules.

Ensure that all databases hosted in Azure are accessible from VM1 and VM2 without relying on
public endpoints.

Business Requirements

Litware identifies the following business requirements:

Meet an SLA of 99.99% availability for all Azure deployments.

Minimize downtime during the migration of the SERVER1 databases.

Use the Azure Hybrid Use Benefits when migrating workloads to Azure.

Once all requirements are met, minimize costs whenever possible.

Question: 1

HOTSPOT

You are planning the migration of the SERVER1 databases. The solution must meet the business
requirements.

What should you include in the migration plan? To answer, select the appropriate options in the
answer area.

NOTE: Each correct selection is worth one point.


Answer:
Explanation:

Azure Database Migration Service

Box 1: Premium 4-VCore

Scenario: Migrate the SERVER1 databases to the Azure SQL Database platform.

Minimize downtime during the migration of the SERVER1 databases.

Premium 4-vCore is for large or business-critical workloads. It supports online migrations, offline
migrations, and faster migration speeds.

Incorrect Answers:

The Standard pricing tier suits most small- to medium-sized business workloads, but it supports offline
migration only.


Box 2: A VPN gateway

You need to create a Microsoft Azure Virtual Network for the Azure Database Migration Service by
using the Azure Resource Manager deployment model, which provides site-to-site connectivity to
your on-premises source servers by using either ExpressRoute or VPN.

Reference:

https://azure.microsoft.com/pricing/details/database-migration/

https://docs.microsoft.com/en-us/azure/dms/tutorial-sql-server-azure-sql-online

Question: 2
DRAG DROP

You need to configure user authentication for the SERVER1 databases. The solution must meet the
security and compliance requirements.

Which three actions should you perform in sequence? To answer, move the appropriate actions from
the list of actions to the answer area and arrange them in the correct order.


Answer:
Explanation:

Scenario: Authenticate database users by using Active Directory credentials.

The configuration steps include the following procedures to configure and use Azure Active Directory
authentication.

Create and populate Azure AD.

Optional: Associate or change the active directory that is currently associated with your Azure
Subscription.


Create an Azure Active Directory administrator. (Step 1)

Configure your client computers.

Create contained database users in your database mapped to Azure AD identities. (Step 2)

Connect to your database by using Azure AD identities. (Step 3)
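Steps 2 and 3 can be sketched in T-SQL as follows. This is a minimal sketch: the user principal name is a hypothetical example, and the contained user must be created in the target user database (not in master) while connected as the Azure AD administrator configured in step 1.

```sql
-- Step 2: create a contained database user mapped to an Azure AD identity.
-- Run in the user database while connected as the Azure AD administrator
-- from step 1. The principal name below is a hypothetical example.
CREATE USER [dbuser1@litwareinc.com] FROM EXTERNAL PROVIDER;

-- Step 3: connect by using that Azure AD identity, then verify the login:
SELECT SUSER_SNAME();
```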

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-overview

Question: 3
HOTSPOT

You need to implement the monitoring of SalesSQLDb1. The solution must meet the technical
requirements.

How should you collect and stream metrics? To answer, select the appropriate options in the answer
area.

NOTE: Each correct selection is worth one point.


Answer:
Explanation:

Box 1: The server, the elastic pool, and the database

Scenario:

SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool.

Litware's technical requirements include: all SQL Server and Azure SQL Database metrics related to
CPU and storage usage and limits must be analyzed by using Azure built-in functionality.

Box 2: Azure Event Hubs

Event Hubs can handle custom metrics.

Incorrect Answers:

Azure Log Analytics

Azure metric and log data are sent to Azure Monitor Logs, previously known as Azure Log Analytics,
directly by Azure. Azure SQL Analytics is a cloud-only monitoring solution supporting streaming of
diagnostics telemetry for all of your Azure SQL databases.


However, because Azure SQL Analytics does not use agents to connect to Azure Monitor, it does not
support monitoring of SQL Server hosted on-premises or in virtual machines.

Question: 4

You need to identify the cause of the performance issues on SalesSQLDb1.

Which two dynamic management views should you use? Each correct answer presents part of the
solution.

NOTE: Each correct selection is worth one point.

A. sys.dm_pdw_nodes_tran_locks

B. sys.dm_exec_compute_node_errors

C. sys.dm_exec_requests

D. sys.dm_cdc_errors

E. sys.dm_pdw_nodes_os_wait_stats

F. sys.dm_tran_locks

Answer: AE
Explanation:

SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and
frequent blocking queries.

A: Use sys.dm_pdw_nodes_tran_locks instead of sys.dm_tran_locks for Azure Synapse Analytics
(SQL Data Warehouse) or Parallel Data Warehouse.

E: Example:

The following query will show blocking information.


SELECT
    t1.resource_type,
    t1.resource_database_id,
    t1.resource_associated_entity_id,
    t1.request_mode,
    t1.request_session_id,
    t2.blocking_session_id
FROM sys.dm_tran_locks AS t1
INNER JOIN sys.dm_os_waiting_tasks AS t2
    ON t1.lock_owner_address = t2.resource_address;

Note: Depending on the system you're working with, you can access these wait statistics from one of
three locations:

sys.dm_os_wait_stats: for SQL Server

sys.dm_db_wait_stats: for Azure SQL Database

sys.dm_pdw_nodes_os_wait_stats: for Azure SQL Data Warehouse

Incorrect Answers:

F: sys.dm_tran_locks returns information about currently active lock manager resources in SQL
Server 2019 (15.x). Each row represents a currently active request to the lock manager for a lock that
has been granted or is waiting to be granted.

Instead use sys.dm_pdw_nodes_tran_locks.

Reference:

https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-tran-locks-transact-sql


Question: 5

HOTSPOT

You need to recommend a configuration for ManufacturingSQLDb1 after the migration to Azure. The
solution must meet the business requirements.

What should you include in the recommendation? To answer, select the appropriate options in the
answer area.

NOTE: Each correct selection is worth one point.

Answer:
Explanation:


Scenario: Business Requirements

Litware's business requirements include meeting an SLA of 99.99% availability for all Azure
deployments.

Box 1: Cloud witness

If you have a Failover Cluster deployment, where all nodes can reach the internet (by extension of
Azure), it is recommended that you configure a Cloud Witness as your quorum witness resource.

Box 2: Azure Standard Load Balancer

Microsoft guarantees that a Load Balanced Endpoint using Azure Standard Load Balancer, serving two
or more Healthy Virtual Machine Instances, will be available 99.99% of the time.

Note: There are two main options for setting up your listener: external (public) or internal. The
external (public) listener uses an internet facing load balancer and is associated with a public Virtual
IP (VIP) that is accessible over the internet. An internal listener uses an internal load balancer and
only supports clients within the same Virtual Network.

Reference:

https://technet.microsoft.com/windows-server-docs/failover-clustering/deploy-cloud-witness

https://azure.microsoft.com/en-us/support/legal/sla/load-balancer/v1_0/


Question: 6

You need to implement authentication for ResearchDB1. The solution must meet the security and
compliance requirements.

What should you run as part of the implementation?

A. CREATE LOGIN and the FROM WINDOWS clause

B. CREATE USER and the FROM CERTIFICATE clause

C. CREATE USER and the FROM LOGIN clause

D. CREATE USER and the ASYMMETRIC KEY clause

E. CREATE USER and the FROM EXTERNAL PROVIDER clause

Answer: E
Explanation:

Scenario: Authenticate database users by using Active Directory credentials.

(Create a new Azure SQL database named ResearchDB1 on a logical server named ResearchSrv01.)

Authenticate the user in SQL Database or SQL Data Warehouse based on an Azure Active Directory
user:

CREATE USER [Fritz@contoso.com] FROM EXTERNAL PROVIDER;

Reference:

https://docs.microsoft.com/en-us/sql/t-sql/statements/create-user-transact-sql

Question: 7


HOTSPOT

You need to recommend the appropriate purchasing model and deployment option for the 30 new
databases. The solution must meet the technical requirements and the business requirements.

What should you recommend? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Answer:
Explanation:


Box 1: DTU

Scenario:

The 30 new databases must scale automatically.

Once all requirements are met, minimize costs whenever possible.

You can configure resources for the pool based either on the DTU-based purchasing model or the
vCore-based purchasing model.

In short, for simplicity, the DTU model has an advantage. Plus, if you’re just getting started with Azure
SQL Database, the DTU model offers more options at the lower end of performance, so you can get
started at a lower price point than with vCore.

Box 2: An Azure SQL database elastic pool

Azure SQL Database elastic pools are a simple, cost-effective solution for managing and scaling
multiple databases that have varying and unpredictable usage demands. The databases in an elastic
pool are on a single server and share a set number of resources at a set price. Elastic pools in Azure
SQL Database enable SaaS developers to optimize the price performance for a group of databases
within a prescribed budget while delivering performance elasticity for each database.

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-pool-overview


https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-overview

Question: 8
DRAG DROP

You need to implement statistics maintenance for SalesSQLDb1. The solution must meet the
technical requirements.

Which four actions should you perform in sequence? To answer, move the appropriate actions from
the list of actions to the answer area and arrange them in the correct order.


Answer:
Explanation:

Automating Azure SQL DB index and statistics maintenance using Azure Automation:

1. Create Azure automation account (Step 1)

2. Import SQLServer module (Step 2)

3. Add Credentials to access SQL DB

This will use secure way to hold login name and password that will be used to access Azure SQL DB

4. Add a runbook to run the maintenance (Step 3)

Steps:
1. Click on "runbooks" at the left panel and then click "add a runbook"

2. Choose "Create a new runbook", give it a name, choose "PowerShell" as the runbook type, and
then click "Create"


5. Schedule task (Step 4)

Steps:
1. Click on Schedules
2. Click on "Add a schedule" and follow the instructions to choose existing schedule or create a new
schedule.
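The runbook itself typically executes a short T-SQL maintenance batch against SalesSQLDb1. A minimal sketch is shown below; the table name is a hypothetical placeholder.

```sql
-- Refresh all statistics in the database (suitable for a scheduled runbook):
EXEC sp_updatestats;

-- Or update statistics for a single table with a full scan
-- (dbo.SalesOrders is a hypothetical table name):
UPDATE STATISTICS dbo.SalesOrders WITH FULLSCAN;
```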

Reference:

https://techcommunity.microsoft.com/t5/azure-database-support-blog/automating-azure-sql-db-
index-and-statistics-maintenance-using/ba-p/368974

Question: 9

What should you do after a failover of SalesSQLDb1 to ensure that the database remains accessible
to SalesSQLDb1App1?

A. Configure SalesSQLDb1 as writable.

B. Update the connection strings of SalesSQLDb1App1.

C. Update the firewall rules of SalesSQLDb1.

D. Update the users in SalesSQLDb1.


Answer: B
Explanation:

Scenario: SalesSQLDb1 uses database firewall rules and contained database users.

Question: 10

DRAG DROP

You create all of the tables and views for ResearchDB1.

You need to implement security for ResearchDB1. The solution must meet the security and
compliance requirements.

Which three actions should you perform in sequence? To answer, move the appropriate actions from
the list of actions to the answer area and arrange them in the correct order.


Answer:
Explanation:

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/always-encrypted-azure-key-vault-configure?tabs=azure-powershell
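As a rough sketch, the Key Vault-backed column master key used by Always Encrypted is registered in the database with T-SQL similar to the following. The key name and vault URL are placeholders; the column encryption key and the ENCRYPTED WITH column definitions are normally generated by a tool such as SQL Server Management Studio.

```sql
-- Register an Always Encrypted column master key stored in Azure Key Vault.
-- The key path below is a placeholder, not a real vault.
CREATE COLUMN MASTER KEY CMK_ResearchDB1
WITH (
    KEY_STORE_PROVIDER_NAME = N'AZURE_KEY_VAULT',
    KEY_PATH = N'https://contoso-vault.vault.azure.net/keys/CMK1/0123456789abcdef'
);
```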

Question: 11

You need to recommend a solution to ensure that the customers can create the database objects.
The solution must meet the business goals.

What should you include in the recommendation?

A. For each customer, grant the customer ddl_admin to the existing schema.

B. For each customer, create an additional schema and grant the customer ddl_admin to the new
schema.

C. For each customer, create an additional schema and grant the customer db_writer to the new
schema.

D. For each customer, grant the customer db_writer to the existing schema.


Answer: B
Explanation:
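A minimal T-SQL sketch of option B, with hypothetical names, might look like the following. Granting ALTER on the new schema, together with the database-level CREATE permissions, lets the customer create objects only in that schema while leaving the existing Contoso objects untouched.

```sql
-- Hypothetical names: Customer1 schema, Customer1User contained user.
CREATE SCHEMA Customer1 AUTHORIZATION dbo;
GO
-- Database-level permissions required to create objects:
GRANT CREATE TABLE, CREATE VIEW, CREATE PROCEDURE TO Customer1User;
-- Scope object creation and modification to the customer's own schema:
GRANT ALTER ON SCHEMA::Customer1 TO Customer1User;
```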

Question: 12

You are evaluating the business goals.

Which feature should you use to provide customers with the required level of access based on their
service agreement?

A. dynamic data masking

B. Conditional Access in Azure

C. service principals

D. row-level security (RLS)

Answer: D
Explanation:

Reference:

https://docs.microsoft.com/en-us/sql/relational-databases/security/row-level-security?view=sql-server-ver15
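A minimal row-level security sketch, with hypothetical schema, table, and column names: a predicate function compares the service tier stored in SESSION_CONTEXT with the tier required by each row, and a security policy binds the predicate to the table.

```sql
CREATE SCHEMA Security;
GO
-- A row is visible only when the session's service tier (set by the app
-- via sp_set_session_context) is at or above the row's required tier.
CREATE FUNCTION Security.fn_TierPredicate (@RequiredTier int)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE CAST(SESSION_CONTEXT(N'ServiceTier') AS int) >= @RequiredTier;
GO
CREATE SECURITY POLICY Security.TierFilter
    ADD FILTER PREDICATE Security.fn_TierPredicate(RequiredTier)
    ON dbo.FinancialData
WITH (STATE = ON);
```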

Question: 13

You need to provide an implementation plan to configure data retention for ResearchDB1. The
solution must meet the security and compliance requirements.


What should you include in the plan?

A. Configure the Deleted databases settings for ResearchSrv01.

B. Deploy and configure an Azure Backup server.

C. Configure the Advanced Data Security settings for ResearchDB1.

D. Configure the Manage Backups settings for ResearchSrv01.

Answer: D
Explanation:

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/long-term-backup-retention-configure

Topic 2, Contoso Ltd

Case study

Overview

This is a case study. Case studies are not timed separately. You can use as much exam time as you
would like to complete each case. However, there may be additional case studies and sections on
this exam. You must manage your time to ensure that you are able to complete all questions included
on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is
provided in the case study. Case studies might contain exhibits and other resources that provide
more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.


At the end of this case study, a review screen will appear. This screen allows you to review your
answers and to make changes before you move to the next section of the exam. After you begin a
new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane
to explore the content of the case study before you answer the questions. Clicking these buttons
displays information such as business requirements, existing environment, and problem statements.
If the case study has an All Information tab, note that the information displayed is identical to the
information displayed on the subsequent tabs. When you are ready to answer a question, click the
Question button to return to the question.

Overview

Existing Environment

Contoso, Ltd. is a financial data company that has 100 employees. The company delivers financial
data to customers.

Active Directory

Contoso has a hybrid Azure Active Directory (Azure AD) deployment that syncs to on-premises Active
Directory.

Database Environment

Contoso has SQL Server 2017 on Azure virtual machines shown in the following table.


SQL1 and SQL2 are in an Always On availability group and are actively queried. SQL3 runs jobs,
provides historical data, and handles the delivery of data to customers.

The on-premises datacenter contains a PostgreSQL server that has a 50-TB database.

Current Business Model

Contoso uses Microsoft SQL Server Integration Services (SSIS) to create flat files for customers. The
customers receive the files by using FTP.

Requirements

Planned Changes

Contoso plans to move to a model in which they deliver data to customer databases that run as
platform as a service (PaaS) offerings. When a customer establishes a service agreement with
Contoso, a separate resource group that contains an Azure SQL database will be provisioned for the
customer. The database will have a complete copy of the financial data. The data to which each
customer will have access will depend on the service agreement tier. The customers can change tiers
by changing their service agreement.

The estimated size of each PaaS database is 1 TB.

Contoso plans to implement the following changes:


Move the PostgreSQL database to Azure Database for PostgreSQL during the next six months.

Upgrade SQL1, SQL2, and SQL3 to SQL Server 2019 during the next few months.

Start onboarding customers to the new PaaS solution within six months.

Business Goals

Contoso identifies the following business requirements:

Use built-in Azure features whenever possible.

Minimize development effort whenever possible.

Minimize the compute costs of the PaaS solutions.

Provide all the customers with their own copy of the database by using the PaaS solution.

Provide the customers with different table and row access based on the customer’s service
agreement.

In the event of an Azure regional outage, ensure that the customers can access the PaaS solution
with minimal downtime. The solution must provide automatic failover.

Ensure that users of the PaaS solution can create their own database objects but be prevented from
modifying any of the existing database objects supplied by Contoso.

Technical Requirements

Contoso identifies the following technical requirements:

Users of the PaaS solution must be able to sign in by using their own corporate Azure AD
credentials or have Azure AD credentials supplied to them by Contoso. The solution must avoid using
the internal Azure AD of Contoso to minimize guest users.

All customers must have their own resource group, Azure SQL server, and Azure SQL database. The
deployment of resources for each customer must be done in a consistent fashion.

Users must be able to review the queries issued against the PaaS databases and identify any new
objects created.

Downtime during the PostgreSQL database migration must be minimized.

Monitoring Requirements

Contoso identifies the following monitoring requirements:

Notify administrators when a PaaS database has a higher than average CPU usage.

Use a single dashboard to review security and audit data for all the PaaS databases.

Use a single dashboard to monitor query performance and bottlenecks across all the PaaS
databases.

Monitor the PaaS databases to identify poorly performing queries and resolve query performance
issues automatically whenever possible.

PaaS Prototype

During prototyping of the PaaS solution in Azure, you record the compute utilization of a customer’s
Azure SQL database as shown in the following exhibit.


Role Assignments

For each customer’s Azure SQL Database server, you plan to assign the roles shown in the following
exhibit.


Question: 14

HOTSPOT

You are evaluating the role assignments.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

NOTE: Each correct selection is worth one point.

Answer:
Explanation:

Box 1: Yes


DBAGroup1 is member of the Contributor role.

The Contributor role grants full access to manage all resources, but does not allow you to assign roles
in Azure RBAC, manage assignments in Azure Blueprints, or share image galleries.

Box 2: No

Box 3: Yes

DBAGroup2 is member of the SQL DB Contributor role.

The SQL DB Contributor role lets you manage SQL databases, but not access to them. Also, you can't
manage their security-related policies or their parent SQL servers. As a member of this role you can
create and manage SQL databases.

Reference:

https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles

Question: 15

Based on the PaaS prototype, which Azure SQL Database compute tier should you use?

A. Business Critical 4-vCore

B. Hyperscale

C. General Purpose 4-vCore

D. Serverless

Answer: A
Explanation:

There are CPU and Data I/O spikes for the PaaS prototype. Business Critical 4-vCore is needed.


Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-overview

Question: 16

Which audit log destination should you use to meet the monitoring requirements?

A. Azure Storage

B. Azure Event Hubs

C. Azure Log Analytics

Answer: C
Explanation:

Scenario: Use a single dashboard to review security and audit data for all the PaaS databases.

Dashboards can bring together the operational data that is most important to IT across all your
Azure resources, including telemetry from Azure Log Analytics.

Note: Auditing for Azure SQL Database and Azure Synapse Analytics tracks database events and
writes them to an audit log in your Azure storage account, Log Analytics workspace, or Event Hubs.

Reference:

https://docs.microsoft.com/en-us/azure/azure-monitor/visualize/tutorial-logs-dashboards


Question: 17

What should you implement to meet the disaster recovery requirements for the PaaS solution?

A. Availability Zones

B. failover groups

C. Always On availability groups

D. geo-replication

Answer: B
Explanation:

Scenario: In the event of an Azure regional outage, ensure that the customers can access the PaaS
solution with minimal downtime. The solution must provide automatic failover.

The auto-failover groups feature allows you to manage the replication and failover of a group of
databases on a server or all databases in a managed instance to another region. It is a declarative
abstraction on top of the existing active geo-replication feature, designed to simplify deployment
and management of geo-replicated databases at scale. You can initiate failover manually or you can
delegate it to the Azure service based on a user-defined policy.

The latter option allows you to automatically recover multiple related databases in a secondary
region after a catastrophic failure or other unplanned event that results in full or partial loss of the
SQL Database or SQL Managed Instance availability in the primary region.

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/auto-failover-group-overview

Question: 18


What should you use to migrate the PostgreSQL database?

A. Azure Data Box

B. AzCopy

C. Azure Database Migration Service

D. Azure Site Recovery

Answer: C
Explanation:

Reference:

https://docs.microsoft.com/en-us/azure/dms/dms-overview

Question: 19

You need to implement a solution to notify the administrators. The solution must meet the
monitoring requirements.

What should you do?

A. Create an Azure Monitor alert rule that has a static threshold and assign the alert rule to an
action group.

B. Add a diagnostic setting that logs QueryStoreRuntimeStatistics and streams to an Azure event
hub.

C. Add a diagnostic setting that logs Timeouts and streams to an Azure event hub.

D. Create an Azure Monitor alert rule that has a dynamic threshold and assign the alert rule to an
action group.


Answer: D
Explanation:

Reference:

https://azure.microsoft.com/en-gb/blog/announcing-azure-monitor-aiops-alerts-with-dynamic-thresholds/

Topic 3, ADatum Corporation

Overview

This is a case study. Case studies are not timed separately. You can use as much exam time as you
would like to complete each case. However, there may be additional case studies and sections on
this exam. You must manage your time to ensure that you are able to complete all questions included
on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is
provided in the case study. Case studies might contain exhibits and other resources that provide
more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your
answers and to make changes before you move to the next section of the exam. After you begin a
new section, you cannot return to this section.

To start the case study

To display the first question in this case study, click the Next button. Use the buttons in the left pane
to explore the content of the case study before you answer the questions. Clicking these buttons
displays information such as business requirements, existing environment, and problem statements.
If the case study has an All Information tab, note that the information displayed is identical to the


information displayed on the subsequent tabs. When you are ready to answer a question, click the
Question button to return to the question.

Overview

ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a
website.

Existing Environment

ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three
mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.

SALESDB collects data from the stores and the website.

DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in
two different JSON formats based on the sales channel.

REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process
creates reporting data in REPORTINGDB from the data in SALESDB. The process is implemented as a
SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.

Requirements

Planned Changes

ADatum plans to move the current data infrastructure to Azure. The new infrastructure has the
following requirements:

Migrate SALESDB and REPORTINGDB to an Azure SQL database.

Migrate DOCDB to Azure Cosmos DB.

The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed
online by using Azure Stream Analytics. The analytics process will perform aggregations that must be
done continuously, without gaps, and without overlapping.

As they arrive, all the sales documents in JSON format must be transformed into one consistent
format.

Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.

Technical Requirements

The new Azure data infrastructure must meet the following technical requirements:

Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use a customer-managed key.

SALESDB must be restorable to any given minute within the past three weeks.

Real-time processing must be monitored to ensure that workloads are sized properly based on actual
usage patterns.

Missing indexes must be created automatically for REPORTINGDB.

Disk IO, CPU, and memory usage must be monitored for SALESDB.
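Two of these requirements map directly to features that can be enabled with plain T-SQL once the databases run in Azure SQL Database. A minimal sketch (the database names come from the scenario; everything else is standard syntax, not part of the case study):

```sql
-- REPORTINGDB: have Azure SQL Database create missing indexes automatically
-- (run in the context of REPORTINGDB).
ALTER DATABASE CURRENT SET AUTOMATIC_TUNING (CREATE_INDEX = ON);

-- SALESDB: check recent disk IO, CPU, and memory usage.
-- sys.dm_db_resource_stats keeps roughly one hour of 15-second snapshots.
SELECT TOP (10)
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_memory_usage_percent
FROM   sys.dm_db_resource_stats
ORDER BY end_time DESC;
```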

Question: 20

Which windowing function should you use to perform the streaming aggregation of the sales data?

A. Sliding

B. Hopping

C. Session

D. Tumbling

Answer: D
Explanation:

Scenario: The sales data, including the documents in JSON format, must be gathered as it arrives and
analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations
that must be done continuously, without gaps, and without overlapping.

Tumbling window functions are used to segment a data stream into distinct time segments and
perform a function against them, such as the example below. The key differentiators of a Tumbling
window are that they repeat, do not overlap, and an event cannot belong to more than one tumbling
window.
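As an illustration, a Stream Analytics query that aggregates sales events into contiguous, non-overlapping five-minute windows might look like this (the input name, column names, and window size are assumptions, not part of the scenario):

```sql
-- Tumbling windows repeat, never overlap, and each event belongs to exactly one window.
SELECT
    ChannelId,
    SUM(Amount)        AS TotalSales,
    System.Timestamp() AS WindowEnd
FROM SalesInput TIMESTAMP BY SaleTime
GROUP BY ChannelId, TumblingWindow(minute, 5)
```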

Reference:

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/stream-analytics/stream-analytics-window-functions.md

Question: 21

Which counter should you monitor for real-time processing to meet the technical requirements?

A. SU% Utilization

B. CPU% utilization

C. Concurrent users

D. Data Conversion Errors

Answer: A
Explanation:

Scenario: Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.

The real-time processing is performed by the Azure Stream Analytics job, so its sizing is tracked with the SU% Utilization metric, which shows how much of the streaming units allocated to the job are being consumed. If the metric stays high (for example, above 80 percent), the job is under-provisioned and more streaming units should be allocated; if it stays low, the job is over-provisioned.

Reference:

https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-monitoring

Question: 22

HOTSPOT

You plan to deploy Instance1 by using the following script.

You need to specify the licenseType and storageRedundancy parameters. The deployment must meet the availability requirements and the business requirements for DB1 and DB2.

To what should you set each parameter? To answer, select the appropriate options in the answer
area.

Answer:
Explanation:

Question: 23
You need to recommend a backup solution to restore DB3. The solution must meet the availability requirements.

Which type of backup should you use?

A. transaction log

B. point-in-time restore (PITR)

C. differential

D. long-term retention (LTR)

Answer: C
Explanation:

Question: 24

You need to recommend which configuration to perform twice to enable access to the primary and
secondary replicas of DB3. The solution must meet the availability requirements.

What should you recommend?

A. Configure virtual network service endpoints.

B. Enable database firewall rules.

C. Create database-scoped credentials.

D. Configure connection strings that reference the read-write listener.

Answer: D
Explanation:

Question: 25

DRAG DROP

You need to recommend an authentication solution for App1 access to DB1 and DB2 after their
migration to Instance1. The solution must meet the availability requirements.

Which actions should you perform in sequence? To answer, drag the appropriate actions to the
correct order. Each action may be used once, more than once, or not at all. You may need to drag the
split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Answer:
Explanation:
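The drag-drop answer itself is rendered as an image. As a hedged sketch only, the usual sequence for Microsoft Entra authentication from an application to a SQL managed instance is to set a Microsoft Entra admin for the instance, then create a contained database user for the application's identity and grant it access. The name App1 comes from the scenario; the roles granted are assumptions:

```sql
-- Run as the Microsoft Entra admin of Instance1, in DB1 (repeat for DB2).
-- Creates a contained database user mapped to App1's Microsoft Entra identity.
CREATE USER [App1] FROM EXTERNAL PROVIDER;

-- Grant the minimum access the app needs (role choice is an assumption).
ALTER ROLE db_datareader ADD MEMBER [App1];
ALTER ROLE db_datawriter ADD MEMBER [App1];
```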

Question: 26
HOTSPOT

You need to recommend a service tier and a method to offload analytical workloads for the databases migrated from SVR1. The solution must meet the availability and business requirements.

What should you recommend? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Answer:
Explanation:

Question: 27
You need to recommend a process to automate the management of DB3. The solution must meet the
management requirements. What should be the first step of the process?

A. Configure Microsoft Entra authentication for the logical server that hosts DB3.

B. Create a database that has database-scoped credentials.

C. Configure a private endpoint for connectivity to DB3.

D. Create database-scoped credentials in DB3.

Answer: C
Explanation:

Question: 28

You need to identify the event_file target for monitoring DB3 after the migration to Azure SQL Database. The solution must meet the management requirements.

What should you use as the event_file target?

A. an Azure SQL database

B. an Azure Blob Storage container

C. a SQL Server filegroup

D. an Azure Files share

Answer: B
Explanation:

In Azure SQL Database, an extended events session that uses an event_file target must write its .xel files to an Azure Blob Storage container, because there is no access to the file system of the underlying host.
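A minimal sketch of a database-scoped extended events session that writes to a blob container follows (the session name, event choice, and storage URL are placeholders; a database-scoped credential for the container URL must already exist):

```sql
-- Azure SQL Database supports only database-scoped sessions (ON DATABASE).
CREATE EVENT SESSION [track_batches] ON DATABASE
ADD EVENT sqlserver.sql_batch_completed
ADD TARGET package0.event_file (
    -- The .xel files land in the Blob Storage container named in the URL.
    SET filename = 'https://<storage-account>.blob.core.windows.net/xe-files/db3.xel'
);

ALTER EVENT SESSION [track_batches] ON DATABASE STATE = START;
```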

Question: 30

HOTSPOT

You need to recommend which service and target endpoint to use when migrating the databases
from SVR1 to Instance1. The solution must meet the availability requirements.

What should you recommend? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Answer:
Explanation:

Thank You for trying DP-300 PDF Demo

https://authorizedumps.com/dp-300-exam-dumps/

Start Your DP-300 Preparation

[Limited Time Offer] Use Coupon "SAVE20" for an extra 20% discount on the purchase of the PDF file. Test your DP-300 preparation with actual exam questions.
