
Memberships

Learn Microsoft Fabric

15.3k members • Free

9 contributions to Learn Microsoft Fabric
Increasing Capacity
Hello all, I wanted to get a professional opinion on upgrading our capacity. Here is what we currently have:

- We are on an F16 capacity with 10 users.
- Around 30 dataflows with small amounts of data extracted from Business Central, running 4 times a day (about 15 minutes in total).
- 2 dataflows with a medium load, running 6 times a day (each takes roughly 40 minutes).
- 3 pipelines that run every 3 minutes, doing an incremental refresh of data from Dataverse; each run takes 30 seconds to 1 minute.
- 6 reports, all DirectQuery, plus multiple paginated reports created and exported daily by multiple users.
- Our fact table has around 70 columns and is nearing 30 million records, growing daily by ~1.8 to 2 million.

Average CU usage is 50-70%, and sometimes, when multiple paginated reports are being generated and exported, the throttling percentage goes over the 100% limit. We also have over 30 new users pending access to the reports and paginated reports.

I was aiming to increase the capacity to F64, for multiple reasons: reducing capacity usage, giving the capacity some breathing room with the CPU/memory increase, and the fact that new and existing users won't need a license, since on F64 users with Viewer access can view reports and paginated reports without one. Is that a good option to go with? I would love to hear any professional's perspective on this. Thank you in advance!
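As a rough sanity check on the headroom an upgrade would buy, here is a back-of-envelope sketch. It assumes CU throughput scales linearly with the F SKU number (F64 having 4x the CUs of F16), which ignores burst smoothing and workload-specific behavior:

```python
def projected_usage(current_pct: float, current_sku: int, target_sku: int) -> float:
    """Scale average CU% from one F SKU to another, assuming linear CU scaling."""
    return current_pct * current_sku / target_sku

# Current average CU usage on F16 is 50-70% (from the post above).
low = projected_usage(50, 16, 64)
high = projected_usage(70, 16, 64)
print(f"Projected F64 average usage: {low:.1f}%-{high:.1f}%")  # 12.5%-17.5%
```

Under that assumption, the same workload would sit around 12.5-17.5% average usage on F64, leaving room for the 30+ pending users and the paginated-report spikes.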
Incremental Load Dataflow
I have a dataflow that I use to extract data from Business Central. I want to apply incremental refresh on the dataflow, so I set it up with the following settings:

- Column to filter by: PostingDate
- Extract data from the past: 1 week
- Bucket size: day
- Only extract data when the maximum value in this column changes: LastModifiedDateTime

The user updated records from January 2025, but those records didn't update in my dataflow. Can anyone give me guidance on how I should set up my dataflow's incremental refresh so it updates whenever any record changes?
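A likely explanation for the behavior described above is that the filter column, not the change-detection column, decides which rows are even considered: rows whose PostingDate falls outside the 1-week window are never re-extracted, no matter how recently their LastModifiedDateTime changed. A minimal sketch of that assumed filtering logic (not the actual Dataflow engine):

```python
from datetime import date, timedelta

def in_refresh_window(posting_date: date, today: date, days: int = 7) -> bool:
    """Incremental refresh only re-extracts rows whose filter column
    (PostingDate) falls inside the refresh window."""
    return today - timedelta(days=days) <= posting_date <= today

today = date(2025, 6, 1)                          # hypothetical refresh date
jan_record = date(2025, 1, 15)                    # updated by the user, but posted in January
print(in_refresh_window(jan_record, today))       # False -> never re-extracted
print(in_refresh_window(date(2025, 5, 30), today))  # True -> inside the window
```

If old records can be edited at any time, filtering by a column like LastModifiedDateTime instead of PostingDate (or widening the window to cover the editable range) would let the refresh pick those changes up.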
Incremental Refreshes in Dataflow
Hello, I wanted to ask a couple of things about incremental refresh in a Dataflow Gen2. I implemented incremental refresh on the tables in my dataflow (8 tables) with the following settings: extract data from the past 1 week, bucket size of 1 day, and only extract data when the LastModifiedDateTime value changes.

The tables usually take around 30 minutes to refresh without incremental refresh. The first run of the dataflow took 30 minutes, but after that first run it was still taking 30 minutes. I'm not sure if I'm doing anything wrong, but shouldn't the refresh time be less than the usual refresh time?

One more thing: is it better to have a separate dataflow for each table, or is it OK if they're all in one dataflow? Thank you in advance!

Update: the dataflow was refreshing and all processes went perfectly, but then it failed on a "writing to destination" activity for some of the tables with the error below. What does that mean?

There was a problem refreshing the dataflow: "The refresh for this entity couldn't be executed because the user has reached the evaluation quota. Try again later or consider reducing the overall evaluation usage of the user". Error code: 999999.
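For context on why later runs should normally be faster: the "only extract data when the maximum value in this column changes" setting is, conceptually, a per-bucket change check. A hypothetical sketch of that idea (illustrative only, not the real Dataflow Gen2 implementation):

```python
def buckets_to_refresh(stored_max: dict[str, str], source_max: dict[str, str]) -> list[str]:
    """Re-extract only day buckets whose max(LastModifiedDateTime) differs
    from the value recorded at the previous refresh."""
    return [bucket for bucket, mx in source_max.items() if stored_max.get(bucket) != mx]

stored = {"2025-05-30": "10:00", "2025-05-31": "09:00"}   # from the last refresh
source = {"2025-05-30": "10:00", "2025-05-31": "11:30"}   # one bucket changed since
print(buckets_to_refresh(stored, source))  # ['2025-05-31']
```

So after the first full run, only changed buckets should be re-extracted; if every run still takes the full 30 minutes, that suggests the change check is matching every bucket (or the query is not folding, forcing a full evaluation anyway).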
Microsoft Fabric Subscription
Hello, we were trying to subscribe to a Fabric license, and I received an email with this message: "We are pleased to inform you that your order has been successfully completed. Your subscription(s) has been activated and your Azure New Commerce Experience (NCE) service is ready to use. This email is to inform you that your plan account is active, and provides you with some very important information." It also included a section called "Your Control Panel Login Details" and another called "Your Subscription Information".

But when I tried to access the workspace we had worked on under the Fabric trial license, and opened its data pipelines and notebooks, I got this message: "We couldn't find the page you were looking for", with technical details provided under it. Do I need to do anything to activate the Fabric license, or what is the issue exactly?
Pipeline Not Working Properly
Hi, I have a pipeline that refreshes my data as follows: it first runs 2 dataflows to refresh them, and on success, an ETL notebook runs. The dataflow refreshes in the pipeline were working perfectly fine, but now they fail with an error; they only refresh if I run them manually or schedule the dataflows on their own, outside the pipeline. Here is the error I got:

Query: Error Details: We encountered an error during evaluation. Details: Unknown evaluation error code: 999999

This started happening after our free trial ended. We bought a license, but for some reason, when we now try to open the Power BI report, it says we need to buy a license.
Hussein El Charif
@hussein-el-charif-3678
BI Developer

Active 30d ago
Joined May 12, 2025