arpost
Advocate V

Anyone figured out a nifty way to pass parameters to a dataflow?

I love Power Platform dataflows but am finding the lack of a formal way to pass parameters to them rather frustrating. I was wondering if the community has come up with a nifty workaround for implementing parameter-like functionality. Anyone have some brilliant ideas?

 

One of my use cases: I want to trigger a dataflow when a person pushes a button in an app, but the dataflow then needs to interact with one SPECIFIC file in a folder and no others. I've got a Power Automate flow set up to refresh the dataflow.

22 REPLIES

Thanks, great idea. 👏🏻 I implemented the same. I'll monitor whether this keeps working; so far so good. 🤞🏻

good job!

Mine's been running for about a month, so far so good 🙂

@Cam I was about to do the same ('Job Queue') today, then ended up here while searching for other options. Would you still recommend your solution or did you find some drawbacks? Thank you!

Anonymous

Hi @Cam , appreciate this is an old post now, but we have run into the same problem. Do you remember roughly how you implemented this queue? I keep running into issues. My first thought was:
1. Have a cloud flow that picks up a spreadsheet from a SharePoint folder and imports it to a pre-staging table
2. Insert a record into a dataflow queue table with a unique GUID for the imported file etc., and with a status of 'Awaiting Processing' or similar
3. Then have a new cloud flow that triggers on a record being inserted into this queue table, which essentially just sets the status to 'Processing' and refreshes the dataflow (this preps the pre-staging table data, does loads of processing, and inserts into a new Dataverse table)
4. Now have another cloud flow with a trigger on when the dataflow refresh completes. This is the bit I'm unsure about: how to link the dataflow back to the queue. I could just grab the records where the status is 'Processing', but what happens if 2 files get uploaded? There would be 2 being run concurrently. Did you somehow include the queue data in the dataflow? I have found no way of doing this.

Many thanks for any help

Cam
Kudo Commander

@Gianluca 

My Proof of Concept has been going strong for 3 months.

My queue processes ~10 rows per week at the moment.

It seems stable for this small number of items.

 

I've been curious recently whether I could use 'Azure Queue Storage' as an alternative solution.

E.g. each dataflow gets its own queue; when the dataflow runs, it pops the next item off the front of the queue and reads all of its parameters from that item.

 

Info and Connector

https://learn.microsoft.com/en-us/azure/storage/queues/storage-queues-introduction

https://learn.microsoft.com/en-us/connectors/azurequeues/

 

I'm not sure how best to implement this in a Power Automate flow,

but I'm sure this would be a fun challenge 🙂
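If anyone tries it: the rough shape would be a flow that calls the connector's 'Get messages' action (message count 1) on the dataflow's queue, reads the parameters from the message content, starts the dataflow refresh, and only calls 'Delete message' once the refresh succeeds - that way a failed run leaves the item on the queue, and Queue Storage's visibility timeout hides it from other readers in the meantime. (Action names per the connector docs above; I haven't built this.)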

@Anonymous 

For my POC, the queue table looks like this
(I will be cleaning up my POC in the coming weeks to build the production solution - design at the end of this post):

 

[Screenshot: queue table columns (Cam_2-1673323332457.png)]

For the POC, I'm just storing GUIDs from other tables in the single-line-of-text fields.
Bot Log ID is the main parameter that I pass to the dataflow. (If you had many parameters to pass, you could create a single column called 'parameters', put a JSON object into it, and parse the JSON within the dataflow to get the parameter values.)

[Screenshot: queue table rows with parameter values (Cam_4-1673323952978.png)]
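The parsing side of that would look something like this in Power Query M (a rough, untested sketch; Queue, status, createdon, and parameters are illustrative names, not my actual schema):

    let
        // oldest unprocessed job in the queue
        Pending = Table.SelectRows(Queue, each [status] = "Awaiting Processing"),
        Ordered = Table.Sort(Pending, {{"createdon", Order.Ascending}}),
        TopRow  = Table.First(Ordered),
        // 'parameters' holds a JSON object, e.g. {"BotLogId":"...","JobId":"..."}
        Params  = Json.Document(TopRow[parameters])
    in
        Params[BotLogId]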

 

 


Here is my dataflow. Notice the Parameter group folder on the left, which gets the rows from the queue, filters for the top row, and gets the parameters from it.

These parameters are then used by the Get Job ID and Get Bot ID functions, and those are used by the following ETL queries.

 

[Screenshot: dataflow queries, with the Parameter group folder on the left (Cam_5-1673324061205.png)]
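In M, the Parameter group boils down to something like this (again a rough sketch with illustrative column names):

    let
        // hard-coded per dataflow; each dataflow knows its own GUID
        DataflowGuid = "<this dataflow's GUID>",
        // only this dataflow's jobs, oldest first
        MyJobs  = Table.SelectRows(Queue, each [dataflowid] = DataflowGuid),
        Ordered = Table.Sort(MyJobs, {{"createdon", Order.Ascending}}),
        TopRow  = Table.First(Ordered)
    in
        TopRow[botlogid]   // the scalar that Get Bot ID returns

A downstream ETL query can then reference that scalar by query name, e.g. Table.SelectRows(PreStaging, each [botlogid] = #"Get Bot ID").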


I have 3 cloud flows which handle processing of the queue (this is serious overkill; I will simplify it in the production solution). [Please ignore Ingest - wait for dataflow... that one is used by a parent flow for something else.]
[Screenshot: the three queue-processing cloud flows (Cam_6-1673324197570.png)]




Dataflow Scheduler: triggers every 5 minutes and checks if there are any new 'jobs' (rows) in the dataflow queue table. If there is a job, it checks whether the dataflow for that job is busy. If it is not, it starts the dataflow. Note: the queue is used by multiple dataflows... it's up to you whether you have one queue per dataflow or one queue which handles all dataflows.

[Screenshot: Dataflow Scheduler flow (Cam_7-1673324636096.png)]
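Roughly, in outline (action names from the Dataverse and Power Query Dataflows connectors; the filter and column names are illustrative, not my exact setup):

    Recurrence: every 5 minutes
    List rows on the queue table, filter e.g. status eq 'Awaiting Processing'
    For each dataflow that has pending jobs:
        List rows on DataflowRefreshHistories, looking for a run with no end time yet
        If nothing is in progress: flag the oldest job as 'next in queue'
        and call 'Refresh a dataflow' on it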



Ingest - End Dataflow monitors the DataflowRefreshHistories table (a default table in the CDM).
When a new row is added to that table, the flow updates the associated record in the queue table,
copying the success/failure status into it. It also removes the 'next in queue' flag.

[Screenshot: Ingest - End Dataflow flow (Cam_8-1673324903089.png)]
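The flag is what makes the link-back unambiguous: since a dataflow never runs concurrently with itself, at most one of its queue rows carries 'next in queue' at any time, so a completed refresh maps back to exactly one row even when several files arrive together.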

 

Ingest - Start Dataflow just adds a new row to the queue and saves the variables into that row.

[Screenshot: Ingest - Start Dataflow flow (Cam_10-1673325079813.png)]
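For example, a freshly queued row might look like this (hypothetical values, using the JSON idea from above):

    dataflowid: <GUID of the target dataflow>
    status:     Awaiting Processing
    parameters: {"BotLogId":"<guid>","JobId":"<guid>"}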

 

 

@Gianluca @Anonymous, if you could share how you've created your queues / queue-monitor flows / dataflows, that would be awesome - I'd love to see how you've done this.

Hello @bbarka

 

"4. Now have another cloud flow with a trigger on when the dataflow refresh completes.. This is the bit I'm unsure how to link the dataflow back to the queue.. I could just grab the records where the status is processing but what happens if 2 files get uploaded - there would be 2 being run concurrently. Did you somehow include the Queue data in the dataflow? I have found no way of doing this. "

 

I ran into this same issue.

I made the queue work on two assumptions:

- Each dataflow can only be executed once at a time (I prevent concurrent execution of the same dataflow by waiting until the dataflow is idle before executing it).

- Yes, the dataflow itself uses its GUID to filter the queue; it then gets the first row and reads its parameters from that row.

Anonymous

@Cam this is excellent stuff and exactly what I need, thank you so much! I was getting hung up worrying about what happens if they drop lots of files into the SharePoint folder together - currently that would set off 10 cloud flows, each refreshing the dataflow - but I can see that you instead schedule runs every 5 minutes and ensure they run synchronously. Seems like a really good approach. I only have one dataflow, but it could be run multiple times per day, and the dataflow could update thousands of records each time. At the moment I have one cloud flow that does the following:
1. Triggers on a file being added to a SharePoint folder
2. Creates a record in Dataverse to say the file has been uploaded
3. Reads the header data from the file
4. Does a for-each over the main data in the Excel file and copies it to a pre-staging Dataverse table with status = NotProcessed
5. Refreshes the dataflow (this gets data from the pre-staging table where status = NotProcessed, does a lot of processing/ETL with it, and inserts the results into another Dataverse staging table)
6. Another trigger in the same cloud flow on that dataflow's refresh completing (bizarrely, having another trigger further down the cloud flow works - but I need it in the same flow to know which records to update in the pre-staging table below)
7. Updates all pre-staging records for that same file to status = Processed

The above actually works OK, but the problem is I have to put 1-minute delays between steps 4-5, or sometimes it kicks off the dataflow refresh before all the records are in the pre-staging table. I also need a 1-minute delay after the dataflow-refresh-completed trigger (between steps 6-7), as bizarrely it appeared that even though the trigger fires when the refresh completes, it sometimes thought it had completed when it actually hadn't!
The other problem with my approach is that the dataflow is refreshed every time a file is uploaded, which isn't strictly necessary - e.g. if they upload 10 files, it would be better to put all 10 files in the pre-staging table and then refresh the dataflow once.
Yours seems a better approach, as it doesn't rely on these triggers; having to put 1-minute delays in hurts my soul and feels flaky as hell. Thank you, I will have a go at implementing this. Like you say, it would be interesting to see @Gianluca's approach too.

@Anonymous Hi. Unfortunately, I had to deprioritize this due to workload. I might come back to this in Q2 2023 (I hope...).
