Activity

[Contribution heatmap, Oct through Sep]

Memberships

Team LMF Management

2 members • $5,997

Fabric Dojo 织物

339 members • $447/y

Learn Microsoft Fabric

12.7k members • Free

85 contributions to Learn Microsoft Fabric
Dataflow recovery
Is there a way to recover a corrupted dataflow? We recently had some issues and had to rebuild our Azure environments. Now when I sign into Fabric and attempt to use my previously created dataflows, I cannot even open them.
0 likes • 2d
@Callum Doyle great! Are you able to see your data in the JSON file? I haven't tried this approach yet, but if the JSON file has data, you can create a Dataflow Gen1 from that JSON file and then create a Dataflow Gen2 from it. I can give this a try when I get a chance, or if you try first, let me know how it goes!
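If it helps, a quick notebook cell like the sketch below can confirm whether the export still contains your query definitions before you try the Gen1-from-JSON route. The file name and the key names it searches for are assumptions on my part, since the export format varies:

```python
import json
from pathlib import Path

# Minimal sketch for inspecting an exported dataflow JSON before rebuilding.
# Assumptions: the export was saved locally as "dataflow_export.json", and the
# query definitions live under keys containing "query" or "mashup" -- the real
# key names vary by export format, so adjust after printing the top-level keys.
doc = json.loads(Path("dataflow_export.json").read_text(encoding="utf-8"))

print("Top-level keys:", list(doc.keys()))

# Preview anything that looks like query definitions or mashup (M) code.
for key, value in doc.items():
    if "query" in key.lower() or "mashup" in key.lower():
        print(f"\n{key}:\n{str(value)[:300]} ...")
```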
0 likes • 1d
@Callum Doyle the JSON file shouldn't need decryption. Can you DM me? I can take a look.
Fabric to BW connection
I have a requirement to connect to SAP BW using a Fabric notebook. I am comfortable with the Fabric UI and Spark. Can someone help me with how to set up a connection, fetch the underlying data using a Fabric notebook, and write it to a warehouse? My manager gave me a query like BW_QRY_CPH***HD*_0001 (I have masked the query for security). How can I set up this connection from Fabric to BW?
0 likes • 2d
Hi @Rahul Sharma, just to confirm, what do you mean by BW? Is this SAP BW?
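For reference, if it is SAP BW and you have a JDBC driver JAR you can attach to your Fabric environment, one generic pattern on the notebook side is Spark's built-in JDBC reader. This is only a rough sketch: the driver class, URL, and credentials are placeholders, and the connector you actually end up using may differ.

```python
# Rough sketch of a generic Spark JDBC read from a Fabric notebook.
# Everything in angle brackets is a placeholder; the driver class and URL
# format depend on the vendor-supplied JDBC driver attached to the environment.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:<sap-driver>://<host>:<port>")
    .option("driver", "<com.vendor.jdbc.Driver>")
    .option("query", "SELECT * FROM <bw_source_object>")  # the masked BW query would map here
    .option("user", "<user>")
    .option("password", "<password>")  # better: pull secrets from Key Vault
    .load()
)

# Land the data as a Delta table in the Lakehouse first; moving it on to a
# Warehouse can then be done with a pipeline, T-SQL, or the Fabric Spark connector.
df.write.mode("overwrite").saveAsTable("staging_bw_extract")
```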
INFOR LN → Lakehouse pipeline configuration via JDBC
Hello everyone, I'm working on a data migration project from INFOR LN to our lakehouse. I would like your feedback on:
- Recommended JDBC drivers for INFOR LN
- Extraction strategies (full vs. incremental)
- Possible performance optimizations
- Challenges encountered during implementation
Thanks in advance for your advice!
1 like • 3d
@Azoben Sadio These questions are quite specific to your use case: data size, compute available on your Fabric capacity, which tech stack your devs are more comfortable with, etc. It looks like Infor LN has its own set of ODBC drivers you can use to make a connection; from there it depends on whether the dev team is more comfortable with Dataflows or Notebooks for ingesting this data into your Fabric Lakehouse/Warehouse. The optimization strategy then depends on the size of the data and the complexity of the solution you implement. Hope this helps!
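To make the incremental option concrete, here is a minimal watermark-style sketch for the notebook route. The connection string, driver class, table, and change-tracking column are all assumptions, and Infor LN's own drivers may expose things differently:

```python
# Rough sketch of an incremental pull pattern; not Infor-specific.
# Assumes a JDBC-reachable source, a modification-timestamp column on the
# source table, and a Delta staging table in the Lakehouse. All names are
# placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

target_table = "infor_orders_staging"   # hypothetical Lakehouse table
watermark_col = "last_modified"         # hypothetical change-tracking column

# 1. Find the high-water mark already loaded (fall back to a full load).
if spark.catalog.tableExists(target_table):
    high_water = spark.table(target_table).agg(F.max(watermark_col)).first()[0]
else:
    high_water = None

# 2. Push the filter down to the source so only new/changed rows come across.
source_query = "SELECT * FROM <infor_ln_table>"
if high_water is not None:
    source_query += f" WHERE {watermark_col} > '{high_water}'"

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:<driver>://<host>:<port>")   # placeholder connection string
    .option("driver", "<com.vendor.jdbc.Driver>")
    .option("query", source_query)
    .load()
)

# 3. Append the delta; switch to a MERGE if the source updates existing rows.
df.write.mode("append").saveAsTable(target_table)
```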
I passed my DP-600!
Hey everyone, last week I passed my DP-600 with a score of 874!! I am extremely grateful for your resources @Will Needham 🫶
0 likes • 3d
@Nomusa Lembede congratulations! Onto DP-700 then? 😃
Is there a way to replace tables in PBI?
I have a PBI report that uses a key table, let's call it X. I need to replace this table with another table with the exact same format/column names; is there a way to do this? In Tableau there's a feature called "Replace Data Source". I looked it up and there doesn't seem to be a way to do this in PBI. Has anyone faced this use case?
0 likes • 3d
Hi @Ravi Dhanwani What do you mean by replace? Is it coming from the same source? If so, you can just ingest the table via Transform Data in Desktop, remove the previous table, and then establish the same relationships in the model. Let me know if there's a certain caveat in your scenario where this might not be possible, thanks!
1 like • 3d
@Ravi Dhanwani yes! In the M query of the table. Look for the Source step and rewrite it to pull from the new source, then check whether your subsequent steps still work the same.
Sambhav R
Level 4 • 63 points to level up
@sam-r-9845
Fabric FTW. MS CSS Technical Advisor. DP-600 and DP-700 certified.

Active 1h ago
Joined Jan 18, 2025
Canada