Data Agents
We are trying to consume Microsoft Fabric Data Agents in Copilot Studio. When we try to create a connection we get "Connection creation/edit of 'shared_fabricdataagent' has been blocked by Data Loss Prevention (DLP) policy". We have checked the DLP policy but cannot tell which connector we need to unblock. Does anyone have any experience with this?
Deployment pipelines useless?
Am I missing something with deployment pipelines? They seem to work with very few object types in Fabric. I am developing a Fabric environment with a Lakehouse, Warehouse, Copy Jobs, Pipelines, mirrored databases, etc., and when I try to promote from my dev to test (and prod) workspaces I end up having to manually recreate almost everything. Very few items actually deploy cleanly. Other than the built-in "comparison" screen, which is nice, it seems like a waste of time.
Dataflows Gen2 and Pipelines Break After Ownership Transfer or Workflow Owner Leaves Organization
I have a question regarding Dataflows Gen2 and pipelines. We are experiencing an issue where workflows break when the original creator (owner) of a workflow, lakehouse, or pipeline leaves the organization. After this happens, the workflows are no longer able to read data from lakehouse tables. When investigating, it is not clear whether the issue is related to:
- the lakehouse itself,
- the Dataflow Gen2 itself,
- the connection to the lakehouse,
- the connection between the lakehouse and the Dataflow Gen2,
- or the dataflow configuration and permissions,
- ...?

Even after taking ownership of both the existing dataflows and the lakehouse, the problem persists. In addition, creating a brand-new Dataflow Gen2 does not resolve the issue: when we try to read data from a specific lakehouse table, the dataflow fails during refresh with the same behavior.

Is there a known reason why this issue affects both existing and newly created dataflows, and is there a recommended way to properly restore lakehouse connectivity or reconfigure permissions to resolve this problem? Thank you!
Issue with Dataflow Gen2
I have a question. I want to ingest data from APIs, where I have to call multiple APIs based on issue IDs to get the history of each issue, and this history data is pretty long. To keep my question clear I don't want to give too many details about how I get the data, but my setup is as follows: I want to create a Dataflow Gen2 that gets the data based on a watermark (date) value that I read from a SQL table (for example yesterday, 12/2/2025). So I created a scalar date value that I should be able to use in my query to get only the new records. In the query where I build my table, the last step adds a filter to get all values from the watermark date onwards, like this: Table.SelectRows(IssuesTyped, each [updated_at] >= WM). In Power Query I do get the results, so for this example I see all history of the issues that were updated yesterday and today. But when I want to ingest this table into my data warehouse I get an error: Error Code: Mashup Exception Expression Error. Error Details: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: Expression.Error: Failed to insert a table. InnerException: We cannot apply operator < to types Table and Date. Underlying error: We cannot apply operator < to types Table and Date.
Details: Reason = Expression.Error; ErrorCode = Lakehouse036; Message = We cannot apply operator < to types Table and Date.; Detail = [Operator = "<", Left = error "Microsoft.Mashup.Engine1.Runtime.ValueException: [Expression.Error] Value was not specified.
  at Microsoft.Mashup.Engine1.Language.ValueCreator.CreateValueForThrow(IThrowExpression throwExpr)
  at Microsoft.Mashup.Engine1.Language.ValueCreator.<>c__DisplayClass23_0.<CreateValueForRecord>b__0(Int32 index)
  at Microsoft.Mashup.Engine1.Runtime.RecordValue.DemandRecordValue.get_Item(Int32 index)
  at Microsoft.Data.Mashup.ProviderCommon.MashupResource.TryGetValue(Func`1 getValue, IValue& value, String& errorMessage)
Record", Right = #date(2025, 3, 5)]; Message.Format = We cannot apply operator #{0} to types #{1} and #{2}.; Message.Parameters = {"<", "Table", "Date"}; ErrorCode = 10051; Microsoft.Data.Mashup.Error.Context = User
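The "We cannot apply operator < to types Table and Date" message suggests that WM is still a one-row table at refresh time rather than a scalar date, even though it previews fine in the editor. A minimal Power Query M sketch of drilling the watermark down to a true scalar before filtering (the query and column names WatermarkQuery, WatermarkDate, IssuesTyped, and updated_at are hypothetical placeholders, not from the original post):

```
let
    // Hypothetical: WatermarkQuery is the query that reads the SQL watermark table.
    // {0}[WatermarkDate] drills into row 0, column WatermarkDate, yielding one value,
    // and Date.From guarantees the result is a date scalar rather than a table/record.
    WM = Date.From(WatermarkQuery{0}[WatermarkDate]),

    // The comparison now runs scalar-vs-scalar on every row.
    Filtered = Table.SelectRows(IssuesTyped, each [updated_at] >= WM)
in
    Filtered
```

If WM is defined as its own query, the same idea applies: make that query end in the drill-down expression so it returns a date, not a single-cell table.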
Fabric Mirrored DB as copyjob source?
Has anyone managed to configure a copy job using a mirrored (Azure SQL) database as a source? We finally got mirroring to work into our Bronze layer and now want to copy to Silver, but cannot see an option for it. We can run SQL against the mirror's endpoint just fine...
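Since the mirror's SQL analytics endpoint is queryable, one possible workaround (a sketch, not a confirmed copy-job replacement) is a cross-database CREATE TABLE AS SELECT from a Warehouse in the same workspace; the names MirroredAzureSQL, SilverWH, and dbo.Customers below are hypothetical:

```
-- Hypothetical names: MirroredAzureSQL = the mirrored database item,
-- SilverWH = a Fabric Warehouse in the same workspace acting as Silver.
-- CTAS materializes a copy of the mirrored table into the Silver layer.
CREATE TABLE SilverWH.dbo.Customers AS
SELECT CustomerID, Name, ModifiedDate
FROM MirroredAzureSQL.dbo.Customers;
```

This could be scheduled from a pipeline script activity, though it is a full copy rather than the incremental behavior a copy job would give.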
Learn Microsoft Fabric
skool.com/microsoft-fabric
Helping passionate analysts, data engineers, data scientists (& more) to advance their careers on the Microsoft Fabric platform.