May 14 (edited) • General
DP-600 assessment
Hi all, I came across the question below in the assessment (the 50-question set):
====================
You have a Fabric tenant.
Your company has 1 TB of legacy accounting data stored in an Azure Data Lake Storage Gen2 account. The data is queried only once a year for a few ad-hoc reports that submit very selective queries.
You plan to create a Fabric lakehouse or warehouse to store company sales data. Developers must be able to build reports from the lakehouse or warehouse based on the sales data. The developers must also be able to do ad-hoc analysis of the legacy data at the end of each year.
You need to recommend which Fabric architecture to create and the process for integrating the accounting data into Fabric. The solution must minimize administrative effort and costs.
What should you recommend?
A) Ingest the sales data into the Fabric lakehouse and set up a shortcut to the legacy accounting data in the storage account.
B) Ingest the sales data into the Fabric lakehouse and use a pipeline to move the legacy accounting data into the lakehouse.
C) Ingest the sales data into the Fabric warehouse and use a pipeline to move the legacy accounting data into the warehouse.
D) Set up a lakehouse with a shortcut to the legacy accounting data. Ingest the sales data into the Fabric warehouse and add the SQL analytics endpoint of the lakehouse to the warehouse for cross querying.
====================
I will post the answer and discussion in the next reply.
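Side note for anyone new to shortcuts (mentioned in options A and D): a shortcut only registers a pointer to the external data in OneLake, so nothing is copied and there is no pipeline to maintain. Below is a minimal sketch of creating one programmatically against the OneLake Shortcuts REST API; the workspace ID, lakehouse ID, connection ID, storage account, and token are all placeholders, and in practice the same thing can be done in a couple of clicks from the lakehouse UI.

```python
import requests

# --- Placeholders: substitute your own values ---
WORKSPACE_ID = "<workspace-guid>"       # Fabric workspace hosting the lakehouse
LAKEHOUSE_ID = "<lakehouse-item-guid>"  # the lakehouse item ID
CONNECTION_ID = "<connection-guid>"     # an existing ADLS Gen2 cloud connection
TOKEN = "<bearer-token>"                # Entra ID token with Fabric API scope

# Create a OneLake shortcut in the lakehouse's Files area that points at the
# legacy accounting data in the ADLS Gen2 account (no data is copied).
url = (
    "https://api.fabric.microsoft.com/v1/workspaces/"
    f"{WORKSPACE_ID}/items/{LAKEHOUSE_ID}/shortcuts"
)
body = {
    "path": "Files",              # where the shortcut appears in the lakehouse
    "name": "legacy_accounting",  # shortcut folder name (illustrative)
    "target": {
        "adlsGen2": {
            "location": "https://<storageaccount>.dfs.core.windows.net",
            "subpath": "/<container>/accounting",
            "connectionId": CONNECTION_ID,
        }
    },
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json())
```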