11-27-2021 04:55 PM - last edited 01-31-2024 08:53 AM
SharePoint Batch Update, Create, & Upsert Template
(Also an option for full dataset synchronizations with the "Full Sync" template below)
Compared to using the basic Apply to each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls towards the daily action limit and drastically reduces flow run-times.
It's currently set up to take data from any Excel sheet and update records in SharePoint with corresponding data. It works with any potential Power Automate datasource, including HTTP API GET requests, but Excel provides a simple example.
Part of it works like a VLOOKUP function: it identifies whether a row of updated data in Excel or another datasource matches the designated key column in SharePoint and gets the SharePoint record ID for that match. It then uses the batch update method to update those SharePoint records, and the batch create method to create new items for any records without a match.
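As a rough illustration of that matching step (not the flow itself; the "Email" key column and field names below are just placeholders), the logic is essentially:

```python
# Sketch of the VLOOKUP-style matching: index existing SharePoint items by a
# key column, then split the source rows into updates (matched) and creates
# (unmatched). "Email" is a hypothetical key column.

sp_items = [{"ID": 1, "Email": "a@contoso.com"}, {"ID": 2, "Email": "b@contoso.com"}]
source_rows = [{"Email": "a@contoso.com", "Status": "Done"},
               {"Email": "c@contoso.com", "Status": "New"}]

# Like the flow's "Select IDs + Keys" output: key -> SharePoint ID
sp_index = {item["Email"]: item["ID"] for item in sp_items}

updates, creates = [], []
for row in source_rows:
    match_id = sp_index.get(row["Email"])
    if match_id is None:
        creates.append(row)                      # no match -> batch create
    else:
        updates.append({**row, "ID": match_id})  # match -> batch update by ID

print(updates)  # [{'Email': 'a@contoso.com', 'Status': 'Done', 'ID': 1}]
print(creates)  # [{'Email': 'c@contoso.com', 'Status': 'New'}]
```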
David Wyatt's Flow Optimization Post For Loading SharePoint Records: https://www.linkedin.com/pulse/top-5-ways-optimize-your-flows-david-wyatt/?trackingId=X9bMmnTZ2QBuu4...
Microsoft Batch API Documentation: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
TachyTelic.Net Blog & Videos
SharePoint Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Video: https://youtu.be/2dV7fI4GUYU
SharePoint Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo
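For reference, the changeset format those links describe can be sketched in a few lines. Here is a hedged Python example of building one $batch body, following the request format in the Microsoft Batch API documentation linked above; the site URL, list title, and field names are all placeholders:

```python
import json
import uuid

def build_batch_body(site_url, list_title, rows):
    """Build one $batch body per the Microsoft Batch API docs linked above.

    Rows that carry an "ID" become PATCH (update) requests; rows without
    one become POST (create) requests, all inside a single changeset.
    Returns the boundary id and the body to POST to {site_url}/_api/$batch.
    """
    batch_id = f"batch_{uuid.uuid4()}"
    cs_id = f"changeset_{uuid.uuid4()}"
    items_url = f"{site_url}/_api/web/lists/getByTitle('{list_title}')/items"

    lines = [f"--{batch_id}",
             f"Content-Type: multipart/mixed; boundary={cs_id}",
             ""]
    for row in rows:
        fields = dict(row)                    # don't mutate the caller's row
        item_id = fields.pop("ID", None)
        request_line = (f"PATCH {items_url}({item_id}) HTTP/1.1"
                        if item_id is not None
                        else f"POST {items_url} HTTP/1.1")
        lines += [f"--{cs_id}",
                  "Content-Type: application/http",
                  "Content-Transfer-Encoding: binary",
                  "",
                  request_line,
                  "Content-Type: application/json;odata=nometadata"]
        if item_id is not None:
            lines.append("IF-MATCH: *")       # update regardless of item version
        lines += ["", json.dumps(fields), ""]
    lines += [f"--{cs_id}--", f"--{batch_id}--", ""]
    return batch_id, "\r\n".join(lines)

# Send with header Content-Type: multipart/mixed; boundary=<batch_id>
# to https://<tenant>.sharepoint.com/sites/<site>/_api/$batch (authenticated).
boundary, body = build_batch_body("https://contoso.sharepoint.com/sites/demo",
                                  "MyList",
                                  [{"ID": 12, "Title": "updated"},
                                   {"Title": "created"}])
```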
Version 2.7 - Upsert
-Includes a batch create segment to enable an upsert capability. If anyone wants to only update records, they can remove the Batch Create section. If anyone wants to only create records, they can go to the GenerateSPData action, remove the expression for the ID field, and insert the null value expression.
-Further simplifies the set-up, removing the need to add any additional SharePoint Get items actions or parallel branches.
-Can now work on lists with a few million items without adding more actions or branches. It also implements a faster load method using the SharePoint HTTP action, as described in point 5 of David Wyatt's article linked above.
-The batch loops have been changed from Do until loops to Apply to each loops over chunks of the data, so the batch actions can now run concurrently for additional speed. If you have many batches of data you want to process faster, you can try increasing the concurrency settings on the Apply to each loops containing the SendBatch actions (see the concurrency sketch after these version notes).
-The "setting" inputs action was moved to the top of the flow to help accommodate the new streamlined set-up.
-A SP HTTP call now automatically fixes some issues with referencing the correct list name.
-Faster list load time.
-If you need to batch create &/or update hyperlink columns, check this post
-Adds another HTTP call to get the site users into an object indexed/reference-able by email address & gives an example of how to use that to batch update a person column (sketched below). Any time the updated source dataset has a blank or an email value not found among the top 5000 site users, it will replace any person in that item with a null value.
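A rough Python analogue of that person-column handling (the field names here are illustrative, not the flow's exact schema):

```python
# The flow fetches site users once and builds an object keyed by email;
# "Email" and "Id" are placeholder field names for this sketch.

site_users = [{"Email": "a@contoso.com", "Id": 11},
              {"Email": "b@contoso.com", "Id": 27}]   # top 5000 in the real flow
users_by_email = {u["Email"].lower(): u["Id"] for u in site_users if u.get("Email")}

def person_field_value(source_email):
    """Resolve a source email to a SharePoint user id for a person column.

    Blank emails, or emails not found among the loaded site users, resolve
    to None, which clears any person currently set on the item."""
    if not source_email:
        return None
    return users_by_email.get(source_email.lower())

print(person_field_value("B@contoso.com"))  # 27
print(person_field_value("missing@x.com"))  # None -> person column nulled
```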
Updated set-up screenshots & instructions in this post: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List/m-p/2225500...
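And here is the concurrency sketch mentioned in the version notes: chunk the rows into fixed-size batches and send them in parallel, analogous to raising the Apply to each concurrency setting. send_batch is a hypothetical stand-in for the real SendBatch HTTP call:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(rows, size=1000):
    """Yield consecutive slices of `rows`, `size` items each."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def send_batch(batch):
    # Placeholder for POSTing one $batch request (see the body sketch above).
    return len(batch)

rows = [{"Title": f"item {n}"} for n in range(3500)]

# Degree of parallelism ~ the Apply to each concurrency setting.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(send_batch, chunk(rows)))

print(results)  # [1000, 1000, 1000, 500]
```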
Full Sync V1.1 (Combined Upsert & Batch Deletion Sync)
The SharePoint Batch Full Sync template pulls in all the SP List & Source data to perform a batch upsert. But then it also adds on a Batch Deletion Sync to find & delete all the records in the SP List that are not in the Source dataset.
Now, this is initially set up to pull in all the datasource records for all batch actions in the same Do until Get source data loop. That piece is limited by the 100MB maximum message / variable size in Power Automate, so this Full Sync version will initially only work with datasources of 100MB or less total size. But this is really only because I'm trying to keep the flow simpler for the majority of users, who likely will not have datasources of many hundreds of thousands of records.
If you want to push past this 100MB limitation, you will need to separate the source get data for the batch upsert section from the source get data for the batch deletion sync section. For the batch upsert section, you can use a set-up like the main batch upsert template, which loads records with all columns 100,000 at a time (or 100, or 5000, or whatever your source's per-load limit is) and runs the batch upsert on each source load before the Do until loop fetches the next load; that avoids holding anywhere near 100MB in memory at once because it processes one load at a time. The batch deletion sync section can then use a different get data set-up, similar to the "Do until Get destination list IDs + keys" section of the templates, where each loop pulls a load from the source dataset and uses a Select action to pass only a few columns on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to get only the primary key column from each source data load and pass that on to the "Source data outputs" variable. A full listing of all the primary key values in your source dataset is much smaller than all columns for the entire table, so the 100MB limit should then hold a few million records' worth of the required primary key data for the batch deletion sync process.
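Here is a minimal, stubbed Python sketch of that split layout; the helper names and the "Email" key column are hypothetical, just to make the memory trade-off concrete:

```python
# Upsert page by page, then accumulate only primary keys for the deletion sync.

PAGE = 5000  # or whatever the source allows per load

SOURCE = [{"Email": f"user{n}@contoso.com", "Status": "Active"} for n in range(12000)]
SP_ITEMS = [{"ID": n, "Email": f"user{n}@contoso.com"} for n in range(13000)]

def get_source_page(skip, top=PAGE):
    return SOURCE[skip:skip + top]          # stand-in for one source load

def batch_upsert(rows):                     # stand-in for the upsert scope
    pass

def batch_delete(items):                    # stand-in for the deletion batch
    pass

# Upsert section: handle one load at a time so only ~one page is in memory.
skip = 0
while page := get_source_page(skip):
    batch_upsert(page)
    skip += PAGE

# Deletion-sync section: keep only the key column from each load (the Select
# step), so millions of keys still fit well under the 100MB limit.
source_keys, skip = set(), 0
while page := get_source_page(skip):
    source_keys.update(row["Email"] for row in page)
    skip += PAGE

to_delete = [item for item in SP_ITEMS if item["Email"] not in source_keys]
batch_delete(to_delete)
print(len(to_delete))  # 1000 items present in SP but not in the source
```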
Self Update (See the 1st comment below the main post for the zip download)
The SharePoint Self Batch Update assumes you just want to perform simple updates using only the existing data in the list and removes all the actions related to comparing two datasets to find updates. This may be much easier to use if you just want to quickly do something simple like get all the items created in the past month and mark them all with a Complete status.
But you will be limited to using just the data already in the list and any values you can manually input into the flow.
Version 1.5 - Update
This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing
Version 1 - Update
Version 1 Explanation Video: https://youtu.be/l0NuYtXdcrQ
Download The Template Batch Update Flow
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing
Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((RAND()*100)+1)/2, 0)
If you have trouble importing any of the flows using the standard legacy import, you can also try importing a Power Apps Solutions package here: Re: Batch Update, Create, and Upsert SharePoint Li... - Page 25 - Power Platform Community (microsof...
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
https://www.youtube.com/watch?v=QCkjQy6sHZg
I was able to change the back-end name and it moves on to the next steps, but I still get the following error:
One of the SendBatch actions encountered one of the following HTTP errors while processing the data.
400 Bad Request
401 Unauthorized
403 Forbidden
404 Not Found
408 Request Timeout
409 Conflict
502 Bad Gateway
You can visit (https://developer.mozilla.org/en-US/docs/Web/HTTP/Status) for more information.
This is the original error I was getting yesterday. The back-end name for the ID column is ID; for the Extension column it is Title. I still get this error at the end. It does look like it's retrieving the data. I don't know why it gives an error using ID as the source key column. It is the same one I'm using in the other flow with no issue.
Can you please go to the Variable results or SendBatch action outputs & post the outputs with all the error lines so I can get more info on the specific error it is running into?
I figured it out... the SP HTTP Get Items action seems to be concatenating the ID column plus whatever I use as the key column... originally it was doing IDID, then IDExtension(Title). Since my Excel file also has a matching ID column with the numbers, I wanted to use that, but if I only used Extension it would not match. I removed the first "ID" from the GetItems action and also from Select IDs + Keys. I can see that it now gets the data and sends it up.
The other thing was the SPData... I had to change the mapped column name to the back-end name. I'm still confused as to why that is, since the other version works fine with the actual name.
One last comment/question: it does not seem to let me change the trigger to a recurrence due to the 5000 pagination limit... should I just lower the count from 100,000 to 5000? I need it to run daily in the morning. If I use a manual trigger, it will let me save it with the 100K pagination option.
Thanks so much for all of this. I know I ask too many questions but this is really helping my department. Is there any way I can show my appreciation?
@javijuan
It's not concatenating ID; there is a comma there. It is selecting which columns to pull from the list: one of them is ID, the other is the PrimaryKey column you list in the settings, and confusingly enough there is a duplicate Id column already in there. So adding another ID column to SharePoint just makes things more confusing. I'd recommend naming anything like that "Identifier" in the future to avoid these issues.
The goal is to create a single JSON array output with just the SharePoint built-in number ID and the primary key value, so the later GenerateSPData action can reference that array using the corresponding key from the other source dataset, without needing to keep a copy of the SharePoint ID numbers in that dataset. For things like APIs & other datasources where one does not have as much control over the fields, one will not be able to add a column to track the SharePoint IDs and will need to rely on the key values instead; a good example is a unique email address for every contact in scenarios where one is working with contacts data.
What you are doing instead, inserting a copy of the SharePoint ID number, may work for Excel, but at that point you may not even need the 1st half of this flow, which is all built around forming that reference-able JSON array of the existing data & IDs.
Now if you can't use something with 100,000 pagination, you can go to the Settings of the Excel action & set the pagination to 5000, but you will also want to set the Top Count to 5000, adjust the expression for the Skip count to advance by 5000, and set the value in the Do until loop condition to something like 4990 to ensure everything works as expected.
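In rough Python terms, the paging pattern that describes looks like this; get_excel_rows is a hypothetical stand-in for the Excel "List rows present in a table" action:

```python
# Page through the source 5000 rows at a time; the Do until exits once a
# load comes back with fewer than ~4990 rows (i.e. the last partial page).

TOP = 5000

def get_excel_rows(skip, top=TOP):
    data = [{"ID": n} for n in range(12345)]   # pretend worksheet
    return data[skip:skip + top]

skip, all_rows = 0, []
while True:
    load = get_excel_rows(skip)
    all_rows.extend(load)
    if len(load) <= 4990:    # Do until condition from the reply above
        break
    skip += TOP              # Skip count expression advances by 5000

print(len(all_rows))  # 12345
```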
This is a great and truly life-saving article!
Is there something like quick edit for this?
I need to update rows that already exist in the SP list but are not in Excel... mark these items as invalid... (the reverse logic of what is already there).
I tried a couple of times playing with the conditions, but it seems it's not as easy as I was thinking... not just editing some conditions.
Thanks!
@ctpmthh
The Self Update version I posted above should be a simpler starting-point for what you need.
However, you will still need to create something that identifies all the records in the list that are not in Excel. For that you likely need to replicate some of the logic I built in my Excel Batch Delete template here: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Delete/td-p/1634375
Instead of finding all the records in Excel that are not in the datasource, you will need to set it up to find all the records in a SP List that are not in the datasource. Then you can use the output of those values in the From input of the GenerateSPData action & add the Map rows to mark them as invalid.
Thanks,
It already works...
I created 2 arrays (SP and CSV)... used a Filter array to filter what is in SP and not in CSV... and then I grab the metadata column needed for the batch... But it is not optimal... I loop through whole rows to create the arrays (stored in variables) and then use this filter function...
Thanks!
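For anyone building the same thing, a set-based comparison avoids most of that row-by-row filtering cost; a rough sketch with hypothetical field names:

```python
# Build a set of the CSV keys once, then a single pass over the SP items
# finds the ones missing from the CSV, instead of filtering one array
# against the other row by row.

sp_items = [{"ID": 1, "Key": "A"}, {"ID": 2, "Key": "B"}, {"ID": 3, "Key": "C"}]
csv_rows = [{"Key": "A"}, {"Key": "C"}]

csv_keys = {row["Key"] for row in csv_rows}          # one pass over the CSV
stale = [item for item in sp_items if item["Key"] not in csv_keys]

# `stale` now carries the ID metadata needed to batch-mark items invalid.
print(stale)  # [{'ID': 2, 'Key': 'B'}]
```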
If you want to further push out against this 100MB limitation, then you will need to separate out the source get data for the batch upsert section from another source get data for the batch deletion sync section. So for the batch upsert section you can use a set-up like in the main batch upsert template where it loads records with all columns 100,000 at a time (or 100 or 5000 or whatever your source dataset per load limitations are) and runs the batch upsert on each source load before running the Do until loop again to get the next source load (which avoids holding anywhere near 100MB in memory at once because it is performing things one load at a time). Then the batch deletion sync section can use a different source get data set-up similar to the "Do until Get destination list IDs + keys" section of the templates where each loop can pull a load from the source dataset & then use a Select action to select only a few of the columns to pass on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to only get the primary key column from each source data load & pass that onto the "Source data outputs" / variable. A full listing of all the primary key values in your source dataset will be much smaller than all columns for the entire table, so that 100MB limit should then hold a few million records worth of the required primary key data to run the batch deletion sync process.