I love Power Platform dataflows but am finding the lack of a formal way to pass parameters to them rather frustrating. I was wondering if the community has come up with a nifty workaround for implementing parameter-like functionality. Anyone have some brilliant ideas?
One of my use cases is I want to trigger a dataflow on a file when a person pushes a button in an app, but the dataflow needs to then interact with that SPECIFIC file in a folder and no others. I've got a Power Automate flow set up to refresh the dataflow.
i've had some people ask how i'm ingesting a file into the dataflow... so i'm leaving this here.
hopefully this helps someone
===========
I have a Dataverse table with a File column, and the binary file is stored in that column.
the dataflow gets the binary content of that file, using the web connector, and the dataverse API.
here is an example file (this is a .CSV.... but works with other types of files too), uploaded to a table
this is your source step, in your dataflow.
use the Web.Contents connector (green) with the Dataverse API endpoint (red) in order to get the file's binary content (blue)
once you have the file's binary content (blue)... you can transform it as per usual methods.
for example: in this next step, i open the file as a .CSV, and the following steps do filtering/sorting/stuff
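to make this concrete, here's a minimal sketch of what that source step can look like in the dataflow's query editor. everything here is a placeholder (the org URL, entity set name, row GUID, and file column name) - swap in your own:

```powerquery
let
    // placeholders: your org URL, entity set, row GUID, and file column name
    OrgUrl  = "https://yourorg.crm.dynamics.com",
    ApiPath = "api/data/v9.2/new_fileuploads(00000000-0000-0000-0000-000000000000)/new_file/$value",
    // Web.Contents + the Dataverse API /$value endpoint return the file column's raw binary
    Binary  = Web.Contents(OrgUrl, [RelativePath = ApiPath]),
    // my file is a .CSV, so parse it as one... other parsers (Excel.Workbook, Json.Document, ...) work the same way
    Csv     = Csv.Document(Binary, [Delimiter = ",", Encoding = 65001]),
    Headers = Table.PromoteHeaders(Csv, [PromoteAllScalars = true])
in
    Headers
```

using RelativePath keeps the org URL stable for the credential prompt, which tends to behave better in dataflows than one long hard-coded URL.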
hope this helps
have a nice day
thank you
Hi all, I've recently built a solution with a similar use case to OP, and others in this thread.
My solution consists of:
This is my solution at a high level.
When a file is created in the SharePoint library, it triggers the Dispatcher cloud flow. The dispatcher flow grabs some metadata from the file and creates an item in the Work Queue. This has concurrency control set to 1, to ensure that 1 file always results in 1 work queue item. For my use case, this isn’t a performance issue as there are only 3 csv files created in the SP library, once every 24 hours.
Here is an example of the Work Queue Item which the Dispatcher Flow creates in the Work Queue. The link to the SharePoint file, and some other metadata is included as a JSON string in the ‘input’ column of the Work Queue item. The beauty of this is that you can easily modify the dispatcher flow to include as much metadata as you like, without affecting the logic of anything. You can then use that metadata in subsequent steps, for logging etc.
When a new item is created in the Work Queue, it triggers the Performer cloud flow. The Performer cloud flow has trigger conditions:
The trigger conditions are important if you have multiple Work Queues, and/or dispatchers and performers in your environment – if not implemented, you’ll have a mess of flows triggering based on the wrong event.
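For example, a trigger condition on the Dataverse "When a row is added" trigger can filter on the parent Work Queue. The column name and GUID below are placeholders, not my actual schema:

```
@equals(triggerOutputs()?['body/_new_workqueue_value'], '00000000-0000-0000-0000-000000000000')
```

This goes in the trigger's Settings under Trigger Conditions, so the Performer flow run is never even started for items belonging to other queues.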
Once triggered, the Performer flow calls the relevant Power Apps Dataflow to refresh – and importantly – waits for it to finish refreshing. If you don't have this wait-until-refresh-completes step you would need to include a delay action, otherwise you will likely have issues where a Performer flow attempts to refresh a dataflow whilst it is already mid-refresh.
Here is the step within the Power Apps Dataflow which queries the Work Queue Items Dataverse table, filters on the Work Queue's ID, and keeps only items with a 'Processing' status.
Once I find the Work Queue Item I want, I use the json() function to extract the link to the SharePoint file.
It is worth noting here that the Dataflow gets upset if you do not trim the parameters from the URL.
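As a rough sketch, the query looks something like the below. The connector call, table name, and column names are placeholders based on my description above, not the exact schema:

```powerquery
let
    // placeholder environment + table names - replace with your own
    Source    = CommonDataService.Database("yourorg.crm.dynamics.com"),
    WQItems   = Source{[Schema = "dbo", Item = "new_workqueueitem"]}[Data],
    // filter to this Work Queue's items that are still in 'Processing' status
    Filtered  = Table.SelectRows(WQItems, each
                    [_new_workqueue_value] = "00000000-0000-0000-0000-000000000000"
                    and [new_status] = "Processing"),
    // the 'input' column holds the JSON string the Dispatcher flow wrote
    InputJson = Json.Document(Filtered{0}[new_input]),
    // trim the query-string parameters - the dataflow fails on the full URL
    FileUrl   = Text.BeforeDelimiter(InputJson[fileLink], "?")
in
    FileUrl
```

The `fileLink` field name matches whatever key the Dispatcher flow writes into the JSON string, so keep those two in sync.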
Now that I have the URL to the SharePoint file, I run a separate query to extract and manipulate the data from the actual .csv file.
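That second query is essentially just a download-and-parse step. `FileUrl` here stands in for the result of the previous query (a hypothetical name), and authentication to SharePoint is handled on the connection itself:

```powerquery
let
    // FileUrl is the trimmed link produced by the Work Queue query
    Binary   = Web.Contents(FileUrl),
    // parse the .csv and promote the first row to headers before the real transforms
    Csv      = Csv.Document(Binary, [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Csv, [PromoteAllScalars = true])
in
    Promoted
```

From here the usual filtering/sorting/shaping steps apply before the load to Dataverse.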
Finally, I load this into the Dataverse tables that I ultimately present to the user via a Model Driven App. Full script below for reference.
Hope this is helpful to someone! Would love to hear from anyone on how this approach could be improved or otherwise.