SharePoint Batch Update, Create, & Upsert Template
(Also an option for full dataset synchronizations with the "Full Sync" template below)
Compared to using the basic Apply to each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls towards the daily action limit and drastically reduces flow run-times.
It's currently set up to take data from any Excel sheet and update the corresponding records in SharePoint. It works with any data source Power Automate can connect to, including HTTP API GET requests, but Excel provides a simple example.
Part of it works like a VLOOKUP function: it identifies whether a row of updated data in Excel or another data source matches a SharePoint key column and, if so, gets the SharePoint record ID for that match. It then uses the batch update method to update those SharePoint records and the batch create method to create new items for any records without a match.
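For anyone new to the batch method, here is a minimal sketch of what one $batch request body roughly looks like, assuming a list named 'MyList', a matched item ID of 12, and placeholder site URL and boundary strings (the template generates this body for you, so this is for orientation only):

POST https://yoursite.sharepoint.com/sites/YourSite/_api/$batch
Content-Type: multipart/mixed; boundary=batch_abc123

--batch_abc123
Content-Type: multipart/mixed; boundary=changeset_def456

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

PATCH https://yoursite.sharepoint.com/sites/YourSite/_api/web/lists/getByTitle('MyList')/items(12) HTTP/1.1
Content-Type: application/json;odata=nometadata
If-Match: *

{"Title":"Updated value"}

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

POST https://yoursite.sharepoint.com/sites/YourSite/_api/web/lists/getByTitle('MyList')/items HTTP/1.1
Content-Type: application/json;odata=nometadata

{"Title":"New item"}

--changeset_def456--
--batch_abc123--

A whole batch of rows sent this way counts as a single Power Automate action, which is where the API-call savings come from.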
David Wyatt's Flow Optimization Post For Loading SharePoint Records: https://www.linkedin.com/pulse/top-5-ways-optimize-your-flows-david-wyatt/?trackingId=X9bMmnTZ2QBuu4...
Microsoft Batch API Documentation: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
TachyTelic.Net Blog & Videos
SharePoint Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Video: https://youtu.be/2dV7fI4GUYU
SharePoint Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo
Version 2.7 - Upsert
-Includes a batch create segment to create an upsert capability. If you only want to update records, you can remove the Batch Create section. If you only want to create records, go to the GenerateSPData action, remove the expression for the ID field, and insert a null value expression instead.
-Further simplifies the set-up, removing the need for any additional SharePoint Get items actions or parallel branches.
-Can now work on lists with a few million items without adding more actions or branches. It also implements a faster load method using the SharePoint HTTP action, as described in point 5 of David Wyatt's article linked above (see the sketch after this list).
-The batch loops have been changed from Do until loops to Apply to each loops over chunks of the data, so the batch actions can now run concurrently for additional speed. If you have many batches of data to process faster, you can try increasing the concurrency settings on the Apply to each loops containing the SendBatch actions.
-The "setting" inputs action was moved to the top of the flow to help accommodate the new streamlined set-up.
-An SP HTTP call now automatically fixes some issues with referencing the correct list name.
-Faster list load time.
-If you need to batch create &/or update hyperlink columns, check this post
-Adds another HTTP call to get the site users into an object indexed by (and reference-able by) email address, & gives an example of how to use that to batch update a person column (see the person-column sketch below). Anytime the updated source dataset has a blank or an email value not found in the top 5000 site users, it will replace any person in that item with a null value.
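As a rough illustration of the faster list load and the chunked, concurrent batching described above (the action names below are placeholders, not necessarily the exact names in the template):

Load step - Send an HTTP request to SharePoint (GET):
_api/web/lists/getByTitle('YourList')/items?$select=ID,Title&$top=5000

Batch step - Apply to each (concurrency enabled, e.g. 10 parallel runs) looping over:
chunk(body('GenerateSPData'), 1000)

Each loop iteration then builds and sends one $batch request (the SendBatch action) for its chunk of up to 1,000 rows.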
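And a rough sketch of the person-column approach, assuming an Accept header of application/json;odata=nometadata (again, the action and column names are placeholders):

1. Send an HTTP request to SharePoint (GET):
_api/web/siteusers?$select=Id,Email&$top=5000

2. Select (From: body('Get_Site_Users')?['value'], with the map switched to text mode):
concat('"', toLower(item()?['Email']), '":', item()?['Id'])

3. Compose UsersObject:
json(concat('{', join(body('Select_Site_Users'), ','), '}'))

4. A person field in GenerateSPData can then be set through its hidden integer field, e.g.:
"YourPersonColumnId": @{outputs('UsersObject')?[toLower(item()?['Email'])]}
which resolves to the matching site user's Id, or null when the email is blank or not among the loaded users. Note that some site users (e.g. groups) have no email, so you may need to filter those out before the Select, since toLower(null) will error.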
Updated set-up screenshots & instructions in this post: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List/m-p/2225500...
Full Sync V1.1 (Combined Upsert & Batch Deletion Sync)
The SharePoint Batch Full Sync template pulls in all the SP list & source data to perform a batch upsert, then adds a Batch Deletion Sync to find & delete all the records in the SP list that are not in the source dataset.
This is initially set up to pull in all the data-source records for all batch actions in the same Do until Get source data loop. That piece is limited by the 100MB maximum message / variable size limits in Power Automate, so this Full Sync version will initially only work with data sources of 100MB or less in total size. This is really only because I'm trying to keep the flow simpler for the majority of users, who likely will not have data sources with many hundreds of thousands of records.
If you want to push past this 100MB limitation, you will need to separate the source get-data for the batch upsert section from the source get-data for the batch deletion sync section.

For the batch upsert section, you can use a set-up like the main batch upsert template, where it loads records with all columns 100,000 at a time (or 100, or 5000, or whatever your source dataset's per-load limitations are) and runs the batch upsert on each source load before running the Do until loop again to get the next load. That avoids holding anywhere near 100MB in memory at once, because it processes one load at a time.

The batch deletion sync section can then use a different get-data set-up, similar to the "Do until Get destination list IDs + keys" section of the templates, where each loop pulls a load from the source dataset and uses a Select action to pass only a few of the columns on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to get only the primary key column from each source data load and pass that on to the "Source data outputs" variable. A full listing of all the primary key values in your source dataset will be much smaller than all columns for the entire table, so the 100MB limit should then hold a few million records' worth of the key data required to run the batch deletion sync (see the keys-only sketch below).
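For example, the keys-only Select for the deletion sync could look like this (the action and column names are placeholders):

Select (From: the current source data load, with the map switched to text mode):
item()?['YourPrimaryKeyColumn']

Appending only those key values to the "Source data outputs" variable keeps the accumulated data to a single column per record, which is what lets a few million records fit under the 100MB cap.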
Self Update (See the 1st comment below the main post for the zip download)
The SharePoint Self Batch Update assumes you just want to perform simple updates using only the existing data in the list and removes all the actions related to comparing two datasets to find updates. This may be much easier to use if you just want to quickly do something simple like get all the items created in the past month and mark them all with a Complete status.
But you will be limited to using just the data already in the list and any values you can manually input into the flow.
Version 1.5 - Update
This version makes it easier to handle cases where the list name may have changed since its creation. It also moves a few of the primary key column matching inputs to the 'settings' compose action, so users don't have to look through & edit the more complicated expressions to set up the flow.
The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing
Version 1 - Update
Version 1 Explanation Video: https://youtu.be/l0NuYtXdcrQ
Download The Template Batch Update Flow
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing
Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((RAND()*100)+1)/2, 0)
If you have trouble importing any of the flows using the standard legacy import, you can also try importing a Power Apps Solutions package here: Re: Batch Update, Create, and Upsert SharePoint Li... - Page 25 - Power Platform Community (microsof...
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
https://www.youtube.com/watch?v=QCkjQy6sHZg
Hi.
I am trying to build on this flow to keep our SharePoint lists 100% in sync with the data in Dataverse, including deleting, creating, and updating any changes.
However, I am having difficulties wrapping my head around getting it to remove any items from SharePoint that are no longer in Dataverse.
Can you point me in the right direction on how to achieve this?
I’m guessing you already know of @Paulie78’s SharePoint batch delete resource.
https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/?amp#h-flow-detail
And if the dataset was small enough to just delete each SharePoint record whenever a Dataverse record is deleted, then you probably would have used that method.
So I’m taking this as you need a recurring batch deletion synchronization.
For that you can probably take a few things from my Excel Batch Delete V2B file from this page: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Delete/td-p/1634375
It uses several actions with union( ) & intersection( ) expressions to get down to only the records that exist in one dataset but not in the other.
Unless @Paulie78 knows some simpler method.
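A minimal sketch of that comparison, assuming you already have two arrays of primary-key values, SPKeys from the SharePoint list and DVKeys from Dataverse (both names hypothetical):

Compose KeysInBoth:
intersection(outputs('SPKeys'), outputs('DVKeys'))

Filter array (From: outputs('SPKeys'), advanced mode):
@not(contains(outputs('KeysInBoth'), item()))

The Filter array output is the set of SharePoint keys with no remaining Dataverse match, i.e. the records to feed into the batch delete.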
This is an awesome post. I got it working using an on-prem SQL source instead of an Excel file and it works as expected. However, I found some limitations or issues that I had to work around:
1. Not sure how you are able to read 100,000 records at a time. I get the threshold error: "The attempted operation is prohibited because it exceeds the list view threshold." My list is under 15K, so I set the Top Count to 5000, Pagination to 5000, modified the "do until update", and added extra threads. How did you get items 100K at a time?
2. I had to lower the batch size to under 1000 because I hit another limitation: "The maximum number of bytes allowed to be read from the stream has been exceeded. After the last read operation, a total of 1063857 bytes has been read from the stream however a maximum of 1048576 bytes is allowed". I was updating several more columns than your example.
Cheers
1. I think it’s a licensing thing. I’m usually working with an E3, E5, or premium per user license.
2. If I had to guess, you either have a large number of columns / fields in each row, some multiline text columns with a lot of characters in them, or some mix of those. Yes, reducing the batch size should help in that case.
Good job taking the initiative & figuring out whatever you could on your own.
First, thank you for sharing. I used your flow and it works as intended.
Would you respond to the following issue as well?
I'm extracting a table from SQL where the ID column is listed as a string instead of an integer (example: d3f9dff6-7625-4f23-b1bc-3a9652a5f866)
The flow throws an error when I use this column for an ID, as the HTTP request URL line is not formatted correctly.
Error:
Thank you for the quick response.
- I tried the string() method and it did not work; the URI was still the same, without the apostrophes.
- I changed the 'batchTemplate' code to add apostrophes around the |ID| and it did create the following:
PATCH https://<YOURWEBSITE>.sharepoint.com/sites/zTest_for_support/_api/web/lists/getByTitle('2testdatabase')/items('5D8FC382-4177-4571-9DB4-000FE75BC572') HTTP/1.1
However, now I get the following error:
"value": "Input string was not in a correct format."
- I also tried to use base64() for the dynamic ID code listed under the 'Select Keys from Get Item 1' step.
Any thoughts? I am under the impression that for non-integer ID columns I just need to add apostrophes around the string.
Hi @novice1508,
If that's coming from SharePoint, then the IDs are integers, not GUIDs. GUIDs are more for database environments like Dataverse.
If you ever need to fetch the specific record in your SharePoint list that holds this GUID, you should use a filter predicate instead. For example:
https://<YOURWEBSITE>.sharepoint.com/sites/zTest_for_support/_api/web/lists/getByTitle('2testdatabase')/items?$filter=myGUIDcol eq '5D8FC382-4177-4571-9DB4-000FE75BC572'
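One caveat worth noting (the follow-up below runs into it): a $filter query always returns an array of results, even when only one record matches, so a downstream expression has to take the first element, e.g. something like:

first(body('Send_an_HTTP_request_to_SharePoint')?['d']?['results'])?['ID']

(or ?['value'] in place of ?['d']?['results'] when using a nometadata Accept header; the action name here is a placeholder).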
My 2 cents
Hi @Fred_S,
Thank you for the response.
I tried your suggestion, which sounds great, but I noticed new errors because the filter returns an array of results.
I tried the following:
- /items?$filter=
- /items?$top=1&$filter=
- /items?$select=<column_name>&$top=1
Again, thank you for the suggestion.
I finally realized how to make the original flow work:
In the GenerateSPData step, I just needed to choose the ID number from the destination list.
I originally confused this field with the SharePointKeyColumnName.
Thank you both for responding, and thank you for sharing a fantastic flow.