Hi All,
We currently have a large number of ETL pipelines in Azure Data Factory (ADF) that run Databricks notebooks, which handle data transformations, audit checks, and similar steps before loading the data into Delta Lake.
We're considering migrating this workload from ADF and Databricks to Microsoft Fabric. Is there a recommended approach or set of best practices for doing this?