DP 700 Demo

The document contains a demo of Microsoft DP-700 Exam Questions and Answers related to implementing data engineering solutions using Microsoft Fabric. It includes questions on data access permissions, methods for populating lakehouse medallion layers, and SQL code completion for product dimensions. Each question is accompanied by explanations and correct answers to guide users preparing for the exam.



Thank you for your download.

Microsoft DP-700 Exam Questions & Answers (Demo)
Implementing Data Engineering Solutions Using Microsoft Fabric Exam


https://www.validexamdumps.com/DP-700.html

Version: 4.0

Question: 1

You need to ensure that the data analysts can access the gold layer lakehouse.
What should you do?

A. Add the DataAnalyst group to the Viewer role for WorkspaceA.
B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Answer: C

Explanation:

Data analysts must have read-only access to the Delta tables in the gold layer and must not have access to the bronze and silver layers.



The gold layer data is typically queried via SQL endpoints. Granting the Read all SQL Endpoint data permission allows data analysts to query the data using familiar SQL-based tools while restricting access to the underlying files.
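For example, once the permission is granted, an analyst could query the gold tables from any SQL-capable client. A minimal sketch in Python with pyodbc is shown below; the server name, database, and table are placeholders rather than values from the scenario, and Microsoft Entra interactive sign-in is just one possible authentication option.

```python
# Minimal sketch: querying the gold layer through its SQL analytics endpoint.
# Server, database, and table names are placeholders, not from the scenario.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"  # placeholder
    "Database=GoldLakehouse;"                                     # placeholder
    "Authentication=ActiveDirectoryInteractive;"
)

# Read-only query against a gold-layer table (hypothetical name).
for row in conn.execute("SELECT TOP 10 * FROM dbo.dim_product;"):
    print(row)
```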

Question: 2

HOTSPOT

You need to recommend a method to populate the POS1 data to the lakehouse medallion layers.

What should you recommend for each layer? To answer, select the appropriate options in the answer
area.

NOTE: Each correct selection is worth one point.



Answer:

Explanation:

Bronze Layer: A pipeline Copy activity


The bronze layer is used to store raw, unprocessed data. The requirements specify that no transformations should be applied before landing the data in this layer. Using a pipeline Copy activity ensures minimal development effort, built-in connectors, and the ability to ingest the data directly into Delta format in the bronze layer.

Silver Layer: A notebook


The silver layer involves extensive data cleansing (deduplication, handling missing values, and
standardizing capitalization). A notebook provides the flexibility to implement complex
transformations and is well-suited for this task.
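As an illustration, a minimal PySpark sketch of such a notebook is shown below. It assumes the built-in spark session of a Fabric notebook, a bronze table named bronze_pos1, and illustrative columns (transaction_id, customer_name) that are not part of the exam scenario itself.

```python
# Minimal sketch of the silver-layer cleansing described above.
# Table and column names are illustrative assumptions.
from pyspark.sql import functions as F

df = spark.read.table("bronze_pos1")

cleansed = (
    df.dropDuplicates(["transaction_id"])               # deduplication
      .fillna({"customer_name": "Unknown"})             # handle missing values
      .withColumn("customer_name",
                  F.initcap(F.col("customer_name")))    # standardize capitalization
)

cleansed.write.format("delta").mode("overwrite").saveAsTable("silver_pos1")
```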

Question: 3

You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

A. Create a workspace identity and enable high concurrency for the notebooks.
B. Create a shortcut and ensure that caching is disabled for the workspace.
C. Create a workspace identity and use the identity in a data pipeline.
D. Create a shortcut and ensure that caching is enabled for the workspace.

Answer: B

Explanation:

To ensure that the usage of the data in the Amazon S3 bucket meets the technical requirements, we must address two key points:

Minimize egress costs associated with cross-cloud data access: Using a shortcut ensures that Fabric does not replicate the data from the S3 bucket into the lakehouse but rather provides direct access to the data in its original location. This minimizes cross-cloud data transfer and avoids additional egress costs.

Prevent saving a copy of the raw data in the lakehouses: Disabling caching ensures that the raw data is not copied or persisted in the Fabric workspace. The data is accessed on demand directly from the Amazon S3 bucket.



Question: 4

HOTSPOT

You need to create the product dimension.



How should you complete the Apache Spark SQL code? To answer, select the appropriate options in
the answer area.

NOTE: Each correct selection is worth one point.



Answer:

Explanation:

Join between Products and ProductSubCategories:

Use an INNER JOIN.
The goal is to include only products that are assigned to a subcategory. An INNER JOIN ensures that only matching records (i.e., products with a valid subcategory) are included.

Join between ProductSubCategories and ProductCategories:


Use an INNER JOIN.
Similar to the above logic, we want to include only subcategories assigned to a valid product
category. An INNER JOIN ensures this condition is met.

WHERE Clause
Condition: IsActive = 1

Only active products (where IsActive equals 1) should be included in the gold layer. This filters out
inactive products.
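Putting both joins and the filter together, a minimal Apache Spark SQL sketch (run from a Fabric notebook) is shown below. The join keys, selected columns, and the dim_product target table are illustrative assumptions, since the scenario's full schema is not reproduced in this demo.

```python
# Hedged sketch of the product-dimension query described above.
# Join keys, output columns, and the dim_product target are assumptions.
dim_product = spark.sql("""
    SELECT p.ProductID,
           p.ProductName,
           sc.SubCategoryName,
           c.CategoryName
    FROM Products AS p
    INNER JOIN ProductSubCategories AS sc
        ON p.SubCategoryID = sc.SubCategoryID   -- only products with a subcategory
    INNER JOIN ProductCategories AS c
        ON sc.CategoryID = c.CategoryID         -- only subcategories with a category
    WHERE p.IsActive = 1                        -- only active products
""")

dim_product.write.format("delta").mode("overwrite").saveAsTable("dim_product")
```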

Question: 5

You need to populate the MAR1 data in the bronze layer.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.



A. ForEach

B. Copy data
C. WebHook

D. Stored procedure

Answer: AB

Explanation:

MAR1 has seven entities, each accessible via a different API endpoint. A ForEach activity is required
to iterate over these endpoints to fetch data from each one. It enables dynamic execution of API calls
for each entity.

The Copy data activity is the primary mechanism to extract data from REST APIs and load it into the
bronze layer in Delta format. It supports native connectors for REST APIs and Delta, minimizing
development effort.
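As a rough notebook-style illustration of what the ForEach plus Copy data pattern accomplishes, the sketch below iterates over hypothetical endpoint names and lands each response untransformed in bronze Delta tables. The base URL, the seven entity names, and the table naming are invented for illustration, and each endpoint is assumed to return a JSON array of records; a real solution would use the pipeline activities themselves rather than this code.

```python
# Illustrative sketch only: mimics the ForEach + Copy data pattern in a notebook.
# The base URL, entity names, and bronze table naming are hypothetical.
import requests

BASE_URL = "https://api.example.com/mar1"              # hypothetical API root
ENTITIES = ["customers", "orders", "products", "stores",
            "employees", "suppliers", "returns"]       # seven entities, names assumed

for entity in ENTITIES:                                # the "ForEach" over endpoints
    records = requests.get(f"{BASE_URL}/{entity}").json()
    df = spark.createDataFrame(records)                # land raw data, no transformations
    df.write.format("delta").mode("overwrite").saveAsTable(f"bronze_{entity}")
```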

Thank you for your visit.


To try more exams, please visit the link below:
https://www.validexamdumps.com/DP-700.html

