
Memberships

AI Automation Agency Hub

248.9k members • Free

Data Alchemy

37.5k members • Free

Learn Microsoft Fabric

12.7k members • Free

4 contributions to Learn Microsoft Fabric
How do you manage your study time?
Hello, I was just wondering how people manage their study time between work and family. Do you study before/after work, or maybe in the evening till quite late after putting the kids to bed? Do you stay late at work and study after hours before going home? How do you deal with the frustration of not getting any study in for a few days because of life commitments? Do you have an arrangement with your family that study time is study time? I'd be really interested in how any of you shape or organise your busy lives around study time.
0 likes • May 29
Schedule focused time slots at a time and place where you have no distractions. It's all about keeping distractions at bay, including silencing phone notifications for that time.
Architecture Advice Needed: Managing Historical Data
Hi everyone! 😊 We're in the process of migrating our existing Azure SQL-based data warehouse to Microsoft Fabric, and I'd really appreciate some advice or suggestions from this amazing community. I've learned a lot from the posts and guidance shared here already, and I'm truly thankful for all the insights and support people offer.

Current architecture:
- We use Azure SQL DB as our data warehouse.
- Data is staged from IBM DB2, then typed and incrementally loaded into SQL.
- We have dim and fact views built on top, which are consumed by SSAS models and Power BI.
- For historical data, we archive older views into tables (refreshed yearly), while the most recent 2 years remain live for easier access.

Planned migration to Microsoft Fabric: we're moving to a Lakehouse model in Fabric, with the following structure:
- Bronze: raw files stored as Parquet.
- Silver: cleaned and type-cast tables.
- Gold: tables with business logic applied, targeting Direct Lake (not sure) for Power BI reporting.

Challenges and questions: some of our business logic involves updates and merges, and we're concerned about performance with large tables (some with 48 million rows). The business rules we need to apply include:
- Converting all sales values to GBP
- Finding and applying the latest product cost
- Updating rebate percentages, which often change retroactively over the past 7–8 months

Also, we're trying to figure out how best to handle historical data in Fabric; our current method of archiving into tables and keeping only recent data live (via views) has worked well.

What I'd love your input on:
- What's the best way to handle these kinds of business rules in Fabric without performance bottlenecks?
- Should we move this logic to Dataflows Gen2, Notebooks, Lakehouse SQL, or something else?
- Any best practices for managing historical data in Microsoft Fabric?

Any advice, war stories, or examples would be hugely appreciated!
Thanks again to everyone who shares their knowledge here — it really helps people like me navigate these big changes. 🙏
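For reference, one idea we've been playing with for the archive step is translating our yearly archiving into a Year-partitioned Delta write, roughly like this sketch (table paths and column names here are made up for illustration, not our real ones):

```python
def recent_years(current_year, window=2):
    """Years that stay 'live' when we keep only the most recent `window` years.
    Pure helper, just to make the 2-year cutoff explicit."""
    return list(range(current_year - window + 1, current_year + 1))

def archive_partitioned(df, path="Tables/sales_history"):
    """Write history as a Delta table partitioned by Year (hypothetical names).
    Old partitions sit untouched while recent years stay cheap to query.
    Runs inside a Fabric Spark notebook; `df` is a Spark DataFrame."""
    (df.write
       .format("delta")
       .mode("overwrite")
       .partitionBy("Year")
       .save(path))
```

The appeal over our current view-swap approach is that "archive" and "live" would just be partitions of one table, so nothing needs to be copied yearly.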
1 like • May 29
Spark notebooks for sure, with Delta Lake and partitioning.
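To sketch what I mean for the retroactive rebate rule (a minimal example assuming a Fabric Spark notebook with delta-spark available; table and column names like `sales_silver`, `SaleId`, and `RebatePct` are hypothetical):

```python
def merge_condition(target_alias, source_alias, keys):
    """Build a Delta Lake MERGE join condition from a list of key columns.
    Pure helper, e.g. ["SaleId"] -> "t.SaleId = s.SaleId"."""
    return " AND ".join(
        f"{target_alias}.{k} = {source_alias}.{k}" for k in keys
    )

def apply_rebate_updates(spark, updates_table="rebate_updates",
                         target_table="sales_silver"):
    """MERGE retroactive rebate changes into a Year-partitioned Delta table.
    Table/column names are hypothetical; runs inside a Fabric Spark notebook."""
    from delta.tables import DeltaTable  # available in Fabric Spark runtimes

    updates = spark.read.table(updates_table)
    target = DeltaTable.forName(spark, target_table)
    (target.alias("t")
        .merge(updates.alias("s"),
               merge_condition("t", "s", ["SaleId", "Year"]))
        .whenMatchedUpdate(set={"RebatePct": "s.RebatePct"})
        .execute())
```

If the merge condition constrains the partition column (Year here), Delta can prune partitions it doesn't touch, so a 7–8 month retroactive update doesn't rewrite the whole 48M-row table.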
Hi, did anyone try the Applied Skills assessment?
I am stuck in the puzzle test... can't get past it to create the Fabric account, wth
0 likes • May 29
I got a puzzle asking me to align the second image to the arrow shown in the first image. Using the + and - signs, I was able to change the direction of a house and align it, and then the sandbox opened.
Cleared DP-700
Dear Will and all Fabric enthusiasts, I was able to clear DP-700 today by a whisker 😅. I got very close to clearing DP-600 in the last week of Dec but fell short by just 25 points. But I guess that experience got me to achieve DP-700 today ✌️.

I went through the Microsoft Learn curated content and solved labs for the ingesting data section only (the first of the 5 learning paths) over the last week, using dataflows, data pipelines, and real-time data ingestion. I was running out of time, so I listened to the first few and the last few sections of Will's 6-hour course just 24 hours before the exam. I skipped the real-time sections since I had already done the RTI eventstream and eventhouse labs. The monitoring and optimization sections from Will's 6-hour video were really helpful.

The strategy of reserving 30 minutes for the case study really helped me use my time well during the exam (I was short of time during DP-600!). I had no time left at the end and still had to rush the case study. I had made up my mind not to refer to the MS Learn content during the exam; no point wasting time.

Had it not been for Will's playlist, I couldn't have finished the necessary knowledge acquisition just in time, or been confident enough to sit the exam and clear it today (without doing the available DP-700 practice test even once 😂). So, thanks so much, Will, for the perfect playlist! You are blessed with a tone of voice that doesn't get boring. Of course, I have also led data teams that delivered projects with ADF and Databricks using Data Vault and SCD Type 2, and the data foundations from that experience helped as well.

I would suggest others go through Will's playlist, read through the Microsoft Learn content (5 learning paths), work on the practical labs with the Fabric free trial, and do the free practice tests from Microsoft; you should then be good enough to clear the exam. Reviewing the latest YouTube clips with some question banks will help as well.
There are some questions where you will easily recollect the right answer if you have experience performing those activities in Fabric. So, use the Fabric free trial well.
0 likes • May 20
Thanks all :)
S Amin
@s-amin-5546

Joined Dec 16, 2024
Bengaluru