Activity
[Contribution activity heatmap for the last 12 months]

Memberships

Learn Microsoft Fabric

12.7k members • Free

29 contributions to Learn Microsoft Fabric
Curious to hear from the community
What will Fabric look like three years from now—and how should we prepare for what’s coming?
1 like • Jul 28
Fully automated, with AI Agents integrated :)
Introducing OneLake events (preview) 👀
Today, Microsoft announced a new feature that definitely caught my eye and will be really useful: OneLake events. A new event will fire every time a OneLake file or folder changes, and it can be used to trigger a Data Pipeline! Previously, folder-based triggers could only be set up on Azure Blob Storage folders; now you also get them with OneLake files and folders. This will bring in lots of interesting new automation workflows - how are you going to use this? Read more here: https://blog.fabric.microsoft.com/en-us/blog/unlocking-the-power-of-real-time-data-with-onelake-events?ft=All
Introducing OneLake events (preview) 👀
0 likes • Jul 28
@Will Needham - In my project we have a requirement to implement event-driven triggers. The requirement is as follows:
1. Files are placed by the client in a client shared drive, which is outside of Fabric.
2. For every new file in the shared drive, an event should trigger the data pipeline, pick up that new file from the shared drive (outside of Fabric), process it through the medallion architecture, and load it into the target system.
Disclaimer from the customer: Azure is out of scope for this project; everything should be implemented inside the Fabric environment.
Workarounds tried:
1. We wrote Python code to run from a Fabric notebook and tried to access the file in the shared drive, but we are not able to read files from outside of Fabric.
2. We tried the SAS token option, but that has to be created in the Azure portal, which is out of scope for this project.
3. Azure Event Hubs is another approach, but the same issue applies - Azure is out of scope.
Requesting anyone from the community to provide a suitable solution for this issue. Many thanks in advance.
Upload a .csv file from Visual Studio Code using Python into the Microsoft Fabric lakehouse file system
I would like to upload a .csv file from Visual Studio Code into the Microsoft Fabric lakehouse file system using Python. I have tried the REST API and SDK options but am facing issues. Can someone help me with this? Do you have any approach for the requirement in the subject line?
1 like • Jul 23
@Will Needham - the code snippet below is working to load the file into the lakehouse from Python:

import os
import requests
from dotenv import load_dotenv

load_dotenv('.env')

# Step 1: Load connection details from environment variables
# TENANT_ID = os.getenv('fabric_tenant_id')
# CLIENT_ID = os.getenv('FABRIC_CLIENT_ID')
# CLIENT_SECRET = os.getenv('FABRIC_CLIENT_SECRET')
# LAKEHOUSE_URL = os.getenv('FABRIC_LAKEHOUSE_URL')
# DESTINATION_PATH = os.getenv('FABRIC_FILE_DESTINATION_PATH')
# LOCAL_FILE_PATH = os.getenv('FABRIC_LOCAL_FILE_PATH')
# print("TENANT_ID:", TENANT_ID)
# print("CLIENT_ID:", CLIENT_ID)
# print("CLIENT_SECRET:", CLIENT_SECRET)
# print("LAKEHOUSE_URL:", LAKEHOUSE_URL)
# print("DESTINATION_PATH:", DESTINATION_PATH)
# print("LOCAL_FILE_PATH:", LOCAL_FILE_PATH)

TENANT_ID = "ce8255f5-8eda-41ae-8a43-ee9497702f4c"
CLIENT_ID = "c1023f62-70f8-47ff-8efe-74a30fe28900"
CLIENT_SECRET = ".zx8Q~xXUuoxj1qEbnqgurLuG3TZ8XK5nV1EtaSe"
LAKEHOUSE_URL = "https://onelake.dfs.fabric.microsoft.com/Fabric_datapipeline1609/DP002_LH.Lakehouse/Files"
DESTINATION_PATH = "Dataset_Account_Master_BusinessGlossary.xlsx"
LOCAL_FILE_PATH = "D:\\Metadata-copilot\\Connect_Lakehouse_Python\\my_env\\Dataset_Account_Master_BusinessGlossary.xlsx"

# Step 2: Authenticate using OAuth2 (client credentials flow)
def get_access_token(tenant_id, client_id, client_secret):
    token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    token_data = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://storage.azure.com/.default"
    }
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    response = requests.post(token_url, data=token_data, headers=headers)
    # print(response.json())  # Debug the token response
    return response.json()["access_token"]
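The snippet appears truncated after the token request. As a minimal sketch of how the remaining upload step could look, assuming the script continues with the standard ADLS Gen2-style create/append/flush calls that OneLake's DFS endpoint accepts - the upload_file helper and the final two lines are illustrative, not part of the original post:

# Step 3 (illustrative): upload the local file to the lakehouse Files area
def upload_file(access_token, lakehouse_url, destination_path, local_file_path):
    file_url = f"{lakehouse_url}/{destination_path}"
    headers = {"Authorization": f"Bearer {access_token}"}

    with open(local_file_path, "rb") as f:
        data = f.read()

    # 1. Create (or overwrite) the target file
    requests.put(f"{file_url}?resource=file", headers=headers).raise_for_status()

    # 2. Append the local file's bytes starting at position 0
    requests.patch(f"{file_url}?position=0&action=append", headers=headers, data=data).raise_for_status()

    # 3. Flush to commit the appended bytes
    requests.patch(f"{file_url}?position={len(data)}&action=flush", headers=headers).raise_for_status()

token = get_access_token(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
upload_file(token, LAKEHOUSE_URL, DESTINATION_PATH, LOCAL_FILE_PATH)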
Connect to Fabric SQL DB from Fabric notebook using PySpark
Hello Team, I have a requirement to establish a connection to a Fabric SQL DB from a Fabric notebook using PySpark code. Below are the steps I have tried:
1. Created a Fabric SQL DB (SQLDB1609), schema (SalesLT), and table (Address).
2. Created a notebook and tried the code snippet below:

# Define connection properties (the port belongs in the JDBC URL, not in the server name)
server_name = "6vkyftw2r2xedcsd52kjo4bpjq-s343udatgcyebnur2gormglqku.database.fabric.microsoft.com"
database_name = "SQLDB1609-e46bc349-4c85-43ae-87b8-08f6793bc4f9"

# JDBC URL
jdbc_url = (
    f"jdbc:sqlserver://{server_name}:1433;"
    f"databaseName={database_name};"
    "encrypt=true;trustServerCertificate=false;"
    "hostNameInCertificate=*.database.windows.net;loginTimeout=30;"
)

I am not able to establish the connection. Can someone help me with the different approaches to connect to a Fabric SQL DB from a Fabric notebook using PySpark code?
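One pattern sometimes used for this kind of connection (a sketch under assumptions, not a verified fix for the issue above): acquire a Microsoft Entra access token for the database and pass it to the SQL Server JDBC driver through Spark's JDBC reader, which forwards unrecognized options to the driver as connection properties. The service principal values and the SalesLT.Address table name are illustrative; azure-identity is assumed to be available in the notebook environment, and the Microsoft SQL Server JDBC driver is assumed to be on the Spark classpath.

from azure.identity import ClientSecretCredential

# Illustrative service principal that has been granted access to the Fabric SQL database
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)
# Token audience used for Azure SQL-family endpoints
access_token = credential.get_token("https://database.windows.net/.default").token

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)               # the JDBC URL built above
    .option("dbtable", "SalesLT.Address")
    .option("accessToken", access_token)   # token auth supported by the Microsoft JDBC driver
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.show()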
0 likes • Jul 18
@Pavan Kumar Okay, let me check
0 likes • Jul 23
@Pavan Kumar after creating the username and password, I am still getting the TCP/IP network issue - no response from the server. Most likely the server team needs to check on whitelisting the host and port.
June Leaderboard Winners! 🥇🥈🥉 🥳
A big congrats to the following top contributors in the community for the month of June: 🥇 @Yu Zheng 🥈 @Abdelhak 🥉 @Richard Pařík Thank you all for your contributions to the community; you have all won a free pass to Fabric Dojo for the month of July 🥳 If you're new to the community: every month we give away three one-month passes to Fabric Dojo to the top 3 contributors, as per the 30-day leaderboard! How do you climb the leaderboard? Read this. Well done to everyone 💪
June Leaderboard Winners! 🥇🥈🥉 🥳
1 like • Jul 8
Congratulations winners💐
Venkat Karudumpa
Level 3 • 26 points to level up
@venkat-karudumpa-3664
IT professional with 13 years of experience in the industry, including 3+ years building data engineering workflows.

Active 54d ago
Joined Apr 28, 2024