09-07-2023 07:47 AM
SharePoint Batch Update, Create, & Upsert Template
(Also an option for full dataset synchronizations with the "Full Sync" template below)
Compared to using the basic Apply to each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls towards the daily action limit and drastically reduces flow run-times.
It's currently set to take data from any Excel sheet and update records in SharePoint with corresponding data. It works with any potential Power Automate data-source, including HTTP API GET requests, but Excel provides a simple example.
Part of it works like a Vlookup function: it identifies whether a row of updated data in Excel (or another datasource) matches the value in a SharePoint key column and gets the SharePoint record ID for that match. Then it uses the batch update method to update those SharePoint records, and the batch create method to create new items for any records without a match.
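The matching step above can be sketched in plain Python. This is a hypothetical illustration only (the template does this with Select actions and expressions, not code); the function name, the "Title" key, and the field names are assumptions for the example:

```python
# Sketch of the Vlookup-style split between batch update & batch create.
# source_rows / sp_items are lists of dicts; "Title" is the shared key column.
def split_upsert(source_rows, sp_items, key="Title"):
    # Index existing SharePoint items by key value -> SharePoint item ID
    id_by_key = {item[key]: item["ID"] for item in sp_items}
    updates, creates = [], []
    for row in source_rows:
        sp_id = id_by_key.get(row[key])
        if sp_id is not None:
            updates.append({**row, "ID": sp_id})  # match found -> Batch Update
        else:
            creates.append(row)                   # no match -> Batch Create
    return updates, creates
```

Rows with a key match carry the SharePoint ID into the update batch; everything else flows to the create batch, which is what makes it an upsert.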
David Wyatt's Flow Optimization Post For Loading SharePoint Records: https://www.linkedin.com/pulse/top-5-ways-optimize-your-flows-david-wyatt/?trackingId=X9bMmnTZ2QBuu4...
Microsoft Batch API Documentation: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
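For anyone curious what the SendBatch actions actually post, here is a rough Python sketch of a $batch changeset body in the shape the Microsoft docs above describe. It's illustrative only; the flows build this text with Compose/Select actions, and the function name and parameters are made up for the example:

```python
import json
import uuid

def build_batch_body(site_url, list_name, updates):
    """Build a $batch changeset that PATCHes several list items in one HTTP call.
    Each dict in `updates` must carry the item "ID" plus the fields to set."""
    batch_guid = str(uuid.uuid4())
    changeset_guid = str(uuid.uuid4())
    parts = []
    for item in updates:
        item = dict(item)           # don't mutate the caller's dicts
        item_id = item.pop("ID")
        parts.append(
            f"--changeset_{changeset_guid}\r\n"
            "Content-Type: application/http\r\n"
            "Content-Transfer-Encoding: binary\r\n\r\n"
            f"PATCH {site_url}/_api/web/lists/getByTitle('{list_name}')/items({item_id}) HTTP/1.1\r\n"
            "Content-Type: application/json;odata=nometadata\r\n"
            "If-Match: *\r\n\r\n"
            f"{json.dumps(item)}\r\n"
        )
    body = (
        f"--batch_{batch_guid}\r\n"
        f'Content-Type: multipart/mixed; boundary="changeset_{changeset_guid}"\r\n\r\n'
        + "".join(parts)
        + f"--changeset_{changeset_guid}--\r\n"
        + f"--batch_{batch_guid}--"
    )
    return body, f'multipart/mixed; boundary="batch_{batch_guid}"'
```

One multipart body like this replaces hundreds of individual Update item calls, which is where the API-call savings come from.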
TachyTelic.Net Blog & Videos
SharePoint Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Video: https://youtu.be/2dV7fI4GUYU
SharePoint Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo
Version 2.7 - Upsert
-Includes a batch create segment to create an upsert capability. If anyone wants to only update records, then they can remove the Batch Create section. If anyone wants to only create records, then they can go to the GenerateSPData action, remove the expression for the ID field and insert the null value expression.
-Further simplifies the set-up, removing the need to add any additional SharePoint Get items actions & removing the need for parallel branches.
-Can now work on lists with a few million items without adding more actions or branches. It also implements a faster load method using the SharePoint HTTP action as described in point 5 of this article.
-The batch loops have been changed from Do until loops to chunking into Apply to each loops so the batch actions can now run concurrently for additional speed. If you have many batches of data you want to process faster, you can try increasing the concurrency settings on the Apply to each loops containing the SendBatch actions.
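The chunking those Apply to each loops now perform can be sketched as a one-liner (a hypothetical helper, not part of the template itself; the batch size of 1000 is just an example value):

```python
def chunk(rows, size=1000):
    # Split the full dataset into fixed-size batches; each chunk corresponds
    # to one SendBatch call inside the Apply to each loop, so independent
    # chunks can be posted concurrently.
    return [rows[i:i + size] for i in range(0, len(rows), size)]
```

Because each chunk is independent, raising the loop's concurrency setting lets multiple SendBatch calls run at once.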
-The "setting" inputs action was moved to the top of the flow to help accommodate the new streamlined set-up.
-A SP HTTP call now automatically fixes some issues with referencing the correct list name.
-Faster list load time.
-If you need to batch create &/or update hyperlink columns, check this post
-Adds another HTTP call to get the site users into an object indexed/reference-able by email addresses & gives an example of how to use that to batch update a person column. Anytime the updated source dataset has a blank or an email value not found in the top 5000 site users, it will replace any person in that item with a null value.
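The email-indexed user lookup described above behaves roughly like this Python sketch (hypothetical names; the template builds the index with a Select/expression, and the blank-or-unknown-email-clears-the-person rule is the one stated above):

```python
def index_users_by_email(site_users):
    # Index site users by lowercased email so a person column can be resolved
    # per row; blank or unknown emails resolve to None (clears the person).
    lookup = {u["Email"].lower(): u["Id"] for u in site_users if u.get("Email")}
    def resolve(email):
        return lookup.get(email.lower()) if email else None
    return resolve
```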
Updated set-up screenshots & instructions in this post: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List/m-p/2225500...
Full Sync V1.1 (Combined Upsert & Batch Deletion Sync)
The SharePoint Batch Full Sync template pulls in all the SP List & Source data to perform a batch upsert. But then it also adds on a Batch Deletion Sync to find & delete all the records in the SP List that are not in the Source dataset.
Now, this is initially set up to pull in all the datasource records for all batch actions in the same Do until Get source data loop. That piece is limited by the 100MB maximum message / variable size limits for Power Automate, so this Full Sync version will initially only work with datasources of 100MB or less total size. But this is really only because I'm trying to keep the flow simpler for the majority of users, who likely will not have datasources of many hundreds of thousands of records.
If you want to push past this 100MB limitation, you will need to separate the source get-data for the batch upsert section from the source get-data for the batch deletion sync section.

For the batch upsert section, you can use a set-up like the main batch upsert template, where it loads records with all columns 100,000 at a time (or 100, or 5,000, or whatever your source dataset's per-load limit is) and runs the batch upsert on each source load before the Do until loop fetches the next one. That avoids holding anywhere near 100MB in memory at once, because everything is processed one load at a time.

The batch deletion sync section can then use a different source get-data set-up, similar to the "Do until Get destination list IDs + keys" section of the templates, where each loop pulls a load from the source dataset and uses a Select action to pass only a few columns on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to get only the primary key column from each source load and pass that on to the "Source data outputs" variable. A full listing of the primary key values in your source dataset is much smaller than all columns for the entire table, so the 100MB limit should then hold a few million records' worth of the key data required to run the batch deletion sync.
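The deletion-sync comparison itself reduces to a set difference on the primary keys. A hypothetical Python sketch (assuming "Title" is the key column and that, as described above, only key values were kept from the source loads):

```python
def ids_to_delete(sp_items, source_keys, key="Title"):
    # Any item in the SharePoint list whose key is absent from the source
    # dataset is a candidate for the batch delete.
    source_set = set(source_keys)
    return [item["ID"] for item in sp_items if item[key] not in source_set]
```

This is why holding only the key column in memory is enough for the deletion side of the sync.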
Self Update (See the 1st comment below the main post for the zip download)
The SharePoint Self Batch Update assumes you just want to perform simple updates using only the existing data in the list and removes all the actions related to comparing two datasets to find updates. This may be much easier to use if you just want to quickly do something simple like get all the items created in the past month and mark them all with a Complete status.
But you will be limited to using just the data already in the list and any values you can manually input into the flow.
Version 1.5 - Update
This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing
Version 1 - Update
Version 1 Explanation Video: https://youtu.be/l0NuYtXdcrQ
Download The Template Batch Update Flow
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing
Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((RAND()*100)+1)/2, 0)
If you have trouble importing any of the flows using the standard legacy import, you can also try importing a Power Apps Solutions package here: Re: Batch Update, Create, and Upsert SharePoint Li... - Page 25 - Power Platform Community (microsof...
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
https://www.youtube.com/watch?v=QCkjQy6sHZg
Thanks again @takolota , I am now trying to implement your suggested workaround, with a second Excel list rows.
One thing I'm not clear of, what do you mean by "include the dynamic content of the back-up List rows next to the dynamic content of the 1st List rows wherever it is used"?
Is there a clean way to pull the Excel content only from the active Excel connection, or do I have to set up conditions to check whether it exists?
Attached is a screenshot of how I'm using it in the GenerateSPData part; the value is clearly coming from my first (original) Excel output.
@offirhal The cleanest way would probably be to include a Compose after both Excel actions & input an expression that checks if the 2nd excel output is not empty then use output1 else use output 2
if(not(empty(InsertOutput2)), InsertOutput1, InsertOutput2)
that way that Compose will hold the correct values & you can reference that Compose wherever you need to.
Hi again,
I'm getting an error during the GenerateSPData step. The expression for the ID key value is failing as apparently the source key is not an available property. In fact there are none, as it says "available properties are ''."
Any ideas?
Thanks
@woolie
Some advice when creating SharePoint list columns: don't ever put special characters in the name when you are 1st creating the column. You can save the column without special characters 1st and then go back & rename it with the special characters; that way you avoid common issues like this, because SharePoint escapes your characters in the backend name.
The problem isn't the expression; it's that you aren't referencing the correct internal names for the columns. You actually need to replace the bracket [ with _x005b_ and ] with _x005d_
See here where I added a column with [ ] around the text SpecialChar...
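For reference, the escaping SharePoint applies to internal column names looks roughly like this (a partial, hypothetical sketch: SharePoint escapes many more characters than shown here and also truncates long names, so treat this as illustrating the pattern, not the full rule set):

```python
def to_internal_name(display_name):
    # SharePoint escapes special characters in a column's internal name as
    # _xNNNN_ hex codes; these cover the bracket/ampersand cases from this thread.
    replacements = {"[": "_x005b_", "]": "_x005d_", "&": "_x0026_", " ": "_x0020_"}
    return "".join(replacements.get(ch, ch) for ch in display_name)
```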
The square brackets are only in the source data columns; the Power BI query I'm using to get the data returns each as Table['Column'], whereas the SharePoint list column is just Column. So will replacing the escaped characters still fix this?
I do however have one column with an & symbol in it.
Thanks
@woolie Hm, I’m not as familiar working with PBI columns. What do you have in the From input for the GenerateSPData action?
You may want to use a Parse JSON action on the PBI data & then see what the references look like from that action as it should have the valid references.
body('Run_a_query_against_a_dataset')?['firstTableRows']
The JSON has this form:
{"from":[{"Table[Col1]":"Val11","Table[Col2]":"Val12","Table[Col3]":"Val13"},{"Table[Col1]":"Val21","Table[Col2]":"Val22","Table[Col3]":"Val23"}, ... ]}
However I have noticed there is a blank element at the very end.
I seem to have fixed it, there was an argument in the action to query Power BI which was excluding any rows with nulls. Including these means the flow runs past the GenerateSPData action.
However, I'm now getting a different error from the Batch Create section, at the SendBatch 2 step:
The maximum number of bytes allowed to be read from the stream has been exceeded. After the last read operation, a total of 1063889 bytes has been read from the stream; however a maximum of 1048576 bytes is allowed.
clientRequestId: 91b8bdb3-0954-498a-9a90-679fee8e7dca
serviceRequestId: 3028d8a0-d0b4-7000-296c-edc7006f95d5
I tried reducing the batch size in the settings; however, I'm now getting the hard-coded error message after the final condition that checks the SendBatch actions.
That means at least one row of the data encountered an http error on either the update or create and failed to update or create.
You may want to go through the loops to find which Append to variable got input text, grab that text, & Ctrl+F search it for the different HTTP error numbers to see what error it was.