08-12-2022 21:24 - last edited 08-30-2022 16:49
Generate efficient SQL batch queries for any CRUD operations in any flow or Power App.
This flow transforms Power Apps table data or flow action output JSON data into CSV-style data. The main part of the flow then efficiently transforms that CSV-style data into SQL queries that can Create, Read, Update, or Delete all of the data within a single SQL Query action, with no slow-running loops in the app or flow.
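For a rough idea of what that looks like (using a hypothetical table & column names rather than anything from the actual template), the CSV-style rows get combined into one batch query that a single Execute a SQL Query action can run:

-- CSV-style data built from the JSON array (illustrative):
-- ID,Name,Amount
-- 1,Item A,10.50
-- 2,Item B,24.00
-- One batch insert covering every row, no loop needed:
INSERT INTO dbo.MyTable (ID, Name, Amount)
VALUES
(1, 'Item A', 10.50),
(2, 'Item B', 24.00);

The exact query shape differs by operation, but the idea is the same for create, read, update, & delete: one query action handles all the rows at once.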
Power Apps Test Set-Up & Formula
Example Power Apps Formulas (Anytime I post a certain formula to this page, it breaks the page and won't let me edit it anymore, so you'll have to use the link or download the attached zip file below.)
https://drive.google.com/file/d/1eInULGezXoaxSwsAiairyTBT6FEo9YkN/view?usp=sharing
(The part with the Power Apps JSON array string regex & conversion for the flow response was taken from this video: https://youtu.be/2xWiFkBf294
This can be replaced with the simpler built-in Parse JSON once that feature is in preview or generally available.)
Power Automate Flow Overview Picture
Example Run Update Section
Thanks for any feedback, & please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86)
You can download & import the example flow below.
You can also check out the Power Apps premium custom connector template I created with the same batch processing here:
https://powerusers.microsoft.com/t5/Community-App-Samples/SQL-Batch-Create-Read-Update-and-Delete-Cu...
This uses some custom connector API calls & Logic Apps instead of Power Automate.
I forgot to put the extra single quotes around the string-based columns in the non-Power Apps example Create CSV action. I also put all the main pieces into a Scope for easier copying between flows.
Can pass 2x the amount of Power Apps table data.
Version 1.5
I didn't realize the Power Apps JSON function can output a full JSON array from a collection without any looping. Efficiently forming a JSON array of the data in Power Apps let me move the CSV creation from the Power App back to the flow for that use-case. This simplifies the Power Apps code & standardizes the set-up for any Power App or flow, so users don't have to learn extra details depending on whether or not a Power App calls the flow. Everything now uses basically the same set-up, where a Create CSV table action takes in a JSON array & forms the CSV-style data needed for the rest of the flow. It should be easier to learn & reuse this same flow across several different use-cases now.
Can pass less table data from Power Apps, but is easier to learn & set up.
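Since the same CSV-style data feeds every operation, an update run just builds a different query from the same rows. As a rough sketch (hypothetical table & column names, & not necessarily the exact query shape this template generates), a batch update can run in one query by joining to a VALUES list:

-- Illustrative only: one query action updates every row at once.
UPDATE T
SET T.Name = V.Name,
    T.Amount = V.Amount
FROM dbo.MyTable AS T
INNER JOIN (VALUES
    (1, 'Item A', 12.00),
    (2, 'Item B', 30.00)
) AS V (ID, Name, Amount)
    ON T.ID = V.ID;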
Power Apps is now releasing an experimental feature to more easily parse JSON:
Tutorial: https://youtu.be/FqfLiJDdC3Q
Once this is in preview or generally available, I can adjust this template & replace the more complex REGEX section.
Hi Takolota,
First of all, thank you for your work on this flow, it is very cool. I only realized at the end of it that "Execute a SQL Query (V2)" is not available for on-premises servers. Do you know of an alternative to batch insert CSV or JSON data into SQL?
Thanks!
Is it possible to use an On-Premises Data Gateway and/or Power Automate Desktop to send the query?
https://docs.microsoft.com/en-us/power-automate/gateway-reference
Version 1.7
I had unnecessarily added an expression to try to handle in-data comma errors, and it actually caused an error in the batch create & batch delete. I removed this additional buggy expression because in-data commas should already be handled in SQL by the single quotes ' ' around each string-based value.
The Create CSV table action also creates some errors when it adds double quotes around any item with a comma in it, so I added some replace expressions in the DataLines action to handle that.
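To illustrate the single-quote point (hypothetical column names & made-up values), a comma inside a quoted string value is harmless to the SQL statement; only a single quote inside the data needs special handling, by doubling it:

-- The comma inside the single quotes does not break the statement:
INSERT INTO dbo.Contacts (ID, FullName)
VALUES (1, 'Smith, John');
-- A single quote inside the data is escaped by doubling it:
INSERT INTO dbo.Contacts (ID, FullName)
VALUES (2, 'O''Brien, Mary');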
I have not made anything like that, and I’m actually not sure a similar batch set-up is even technically possible on Microsoft systems at this point.
If you’re curious about it, you could look for any batch document API call options for Azure Blob &/or for Dataverse.
And if you really needed batch actions for document storage, then you could try setting something up to send each document's base64 as a string to the dataset, then have any front-end or viewing mechanism convert the base64 back to a document. But that seems needlessly complicated & the strings would be really long for the documents, so you would still likely have small batch sizes.
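If someone did go that route, a rough sketch (hypothetical table, placeholder value) would just treat the file content as one more long string column in the batch insert:

-- Store the file's base64 text alongside its metadata;
-- the front end would decode it back into a document.
INSERT INTO dbo.Documents (FileName, FileBase64)
VALUES ('Invoice.pdf', '...very long base64 string...');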