11-27-2021 04:55 PM - last edited 01-31-2024 08:53 AM
SharePoint Batch Update, Create, & Upsert Template
(Also an option for full dataset synchronizations with the "Full Sync" template below)
Compared to the basic Apply to each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls toward the daily action limit and drastically reduces flow run-times.
It's currently set to take data from any Excel sheet and update the corresponding records in SharePoint. It works with any potential Power Automate data source, including HTTP API GET requests, but Excel provides a simple example.
Part of it works like a VLOOKUP function: it identifies whether a row of updated data in Excel (or another data source) matches a value in the designated SharePoint key column and, if so, gets the SharePoint record ID for that match. It then uses the batch update method to update those SharePoint records and the batch create method to create new items for any records without a match.
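For anyone curious what a batch call looks like under the hood: each SendBatch action posts one multipart body to the SharePoint $batch endpoint, packing many writes into a single API call. Below is a minimal sketch of a batch containing one update; the tenant, site, list name, item ID, entity type, & field values are placeholders, and the authoritative format is in the Microsoft Batch API documentation linked below.

POST https://{tenant}.sharepoint.com/sites/{site}/_api/$batch HTTP/1.1
Content-Type: multipart/mixed; boundary=batch_abc123

--batch_abc123
Content-Type: multipart/mixed; boundary=changeset_def456

--changeset_def456
Content-Type: application/http
Content-Transfer-Encoding: binary

POST https://{tenant}.sharepoint.com/sites/{site}/_api/web/lists/getByTitle('MyList')/items(42) HTTP/1.1
Content-Type: application/json;odata=verbose
Accept: application/json;odata=nometadata
IF-MATCH: *
X-HTTP-Method: MERGE

{"__metadata":{"type":"SP.Data.MyListListItem"},"Title":"Updated value"}

--changeset_def456--
--batch_abc123--

A batch create uses the same wrapper but POSTs to .../items without the item ID, IF-MATCH, or X-HTTP-Method headers; the __metadata type value depends on your list's internal name.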
David Wyatt's Flow Optimization Post For Loading SharePoint Records: https://www.linkedin.com/pulse/top-5-ways-optimize-your-flows-david-wyatt/?trackingId=X9bMmnTZ2QBuu4...
Microsoft Batch API Documentation: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
TachyTelic.Net Blog & Videos
SharePoint Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Video: https://youtu.be/2dV7fI4GUYU
SharePoint Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo
Version 2.7 - Upsert
-Includes a batch create segment to create an upsert capability. If you only want to update records, you can remove the Batch Create section. If you only want to create records, go to the GenerateSPData action, remove the expression for the ID field, and insert the null value expression.
-Further simplifies the set-up, removing the need to add any additional SharePoint Get items actions & removing the need for parallel branches.
-Can now work on lists with a few million items without adding more actions or branches. It also implements a faster load method using the SharePoint HTTP action, as described in point 5 of David Wyatt's article linked above.
-The batch loops have been changed from Do until loops to chunking into Apply to each loops so the batch actions can now run concurrently for additional speed. If you have many batches of data you want to process faster, you can try increasing the concurrency settings on the Apply to each loops containing the SendBatch actions.
-The "setting" inputs action was moved to the top of the flow to help accommodate the new streamlined set-up.
-A SP HTTP call now automatically fixes some issues with referencing the correct list name.
-Faster list load time.
-If you need to batch create &/or update hyperlink columns, check this post
-Adds another HTTP call to get the site users into an object indexed/reference-able by email address & gives an example of how to use that to batch update a person column. Any time the updated source dataset has a blank or an email value not found in the top 5000 site users, it will replace any person value in that item with null.
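For reference, the user-lookup step in the note above boils down to a SharePoint HTTP GET along these lines (the selected fields & $top value here are illustrative):

GET _api/web/siteusers?$select=Id,Email&$top=5000
Accept: application/json;odata=nometadata

The returned value array can then be reshaped into an object keyed by email so an expression like outputs('UsersByEmail')?[item()?['AssignedToEmail']] (action & column names hypothetical) resolves each email to the numeric user ID the batch body needs.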
Updated set-up screenshots & instructions in this post: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List/m-p/2225500...
Full Sync V1.1 (Combined Upsert & Batch Deletion Sync)
The SharePoint Batch Full Sync template pulls in all the SP List & Source data to perform a batch upsert. But then it also adds on a Batch Deletion Sync to find & delete all the records in the SP List that are not in the Source dataset.
This is initially set up to pull in all the datasource records for all batch actions in the same Do until Get source data loop. That piece is limited by the 100MB maximum message / variable size limits for Power Automate, so this Full Sync version will initially only work with datasources of 100MB or less in total size. That said, this is mainly to keep the flow simpler for the majority of users, who likely will not have datasources of many hundreds of thousands of records.
If you want to push past this 100MB limitation, you will need to separate the source get-data for the batch upsert section from the source get-data for the batch deletion sync section.
For the batch upsert section, you can use a set-up like the main batch upsert template, where it loads records with all columns 100,000 at a time (or 100, or 5,000, or whatever your source's per-load limitation is) and runs the batch upsert on each source load before running the Do until loop again to get the next load. That avoids holding anywhere near 100MB in memory at once because it processes one load at a time.
The batch deletion sync section can then use a different get-data set-up, similar to the "Do until Get destination list IDs + keys" section of the templates, where each loop pulls a load from the source dataset and then uses a Select action to pass only a few of the columns on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to take only the primary key column from each source data load and pass that on to the "Source data outputs" variable. A full listing of the primary key values in your source dataset will be much smaller than all columns for the entire table, so the 100MB limit should then hold a few million records' worth of the key data required to run the batch deletion sync process.
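As a rough sketch of that key-only Select (action & column names here are hypothetical), the Select action's inputs would reduce each loaded row to just its primary key before it is appended to the variable:

"Select_keys_only": {
  "type": "Select",
  "inputs": {
    "from": "@outputs('Get_source_rows')?['body/value']",
    "select": {
      "StringID": "@item()?['StringID']"
    }
  }
}

Each row then carries only one small string instead of the full column set, which is what lets a few million keys fit under the 100MB variable limit.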
Self Update (See the 1st comment below the main post for the zip download)
The SharePoint Self Batch Update assumes you just want to perform simple updates using only the existing data in the list and removes all the actions related to comparing two datasets to find updates. This may be much easier to use if you just want to quickly do something simple like get all the items created in the past month and mark them all with a Complete status.
But you will be limited to using just the data already in the list and any values you can manually input into the flow.
Version 1.5 - Update
This version makes it easier to handle cases where the list name may have changed since its creation, and it moves a few of the primary key column matching inputs to the Settings compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing
Version 1 - Update
Version 1 Explanation Video: https://youtu.be/l0NuYtXdcrQ
Download The Template Batch Update Flow
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing
Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((RAND()*100)+1)/2, 0)
If you have trouble importing any of the flows using the standard legacy import, you can also try importing a Power Apps Solutions package here: Re: Batch Update, Create, and Upsert SharePoint Li... - Page 25 - Power Platform Community (microsof...
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
https://www.youtube.com/watch?v=QCkjQy6sHZg
Yes, so the CSV parsing would go where the Excel action is, & then you would insert the JSON array output of the CSV actions into the From input of the GenerateSPData action. From there, your similar logic for the key values should work once you input the expression & create/fill in the new key fields you've included in the Settings action.
Question though, do you have all the key values in a single column in SharePoint, or do you also need to adjust the expressions/logic for 5 key columns in SharePoint too?
I don't have it in one column in SharePoint... but the values are concatenated in this created ID (with this expression).
I need to create this concatenated field on the CSV side... I think.
@takolota If you have time, could you please be more, hmm, direct?
I created a column for this compound ID in the array from the CSV.
So if you have that single StringID column in the CSV data, then you could use the original template: input StringID as the SourceKeyColumnName value in the Settings action, replace the Excel action with your actions to get the CSV data, & use the output of the CSV data actions in the From input for the GenerateSPData action.
To be any more specific, I’ll need to see your flow & what you have set up for this CSV.
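If you end up building the compound key on the flow side instead of in the CSV, a concat expression inside the GenerateSPData Select (or a Compose) would look something like this, where the column names are hypothetical stand-ins for your CSV columns:

concat(item()?['OrderNumber'], '-', item()?['Region'], '-', item()?['LineNumber'])

Mapping that into a single StringID field gives each CSV row the same one-column key the template's SourceKeyColumnName setting expects.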
Thank you @takolota, you explained that in the video and I forgot.
One other thing: is it possible to add Person fields in the GenerateSPData action, or should I create a flow to do that after?
If you don't have a lot of data, then you can do it afterward, one row at a time.
I have recently found the right SharePoint HTTP call to get the site’s User records. But I’m not sure if I can use those records or their claims or anything in the batch actions without more testing.
Hi @takolota, thanks so much for this post! I've been working on implementing it for the past couple of weeks; so far so good.
My use case is very similar: converting huge Excel files (~130K rows each) into SharePoint lists. I had to limit my batch size setting to 200 and the Excel List rows present in a table Threshold to 5000 (and therefore the Top Count and Skip Count to 5000 as well, and the Do Until Upsert condition to <4999).
I'm creating the Excel tables and naming them each time the flow runs, prior to the Batch Upsert scope (with a constant table name 'Comp03_Tbl' for all files).
The error I'm currently getting seems random: from time to time, the List rows present in a table step fails in one of the Do Until Upsert iterations, claiming that the Excel table name does not exist. Obviously the table with that name is there when I go and check, and it has been there throughout the previous Do Until iterations.
{
  "status": 404,
  "message": "No table was found with the name 'Comp03_Tbl'.\r\nclientRequestId: ce7eade6-dbdb-478b-bec3-9c406743c4fa\r\nserviceRequestId: 95a28545-e676-493f-a296-7d73a5001c7f",
  "error": {
    "message": "No table was found with the name 'Comp03_Tbl'."
  },
  "source": "excelonline-eus.azconn-eus-004.p.azurewebsites.net"
}
I say it's random because so far I haven't been able to reproduce or predict the error. Sometimes the flow runs perfectly without any errors; sometimes it fails with huge files and sometimes with smaller ones. I had cases where it failed on the 2nd iteration and cases where it failed on the 7th but had succeeded in finding the table name on all previous iterations. I've tried it with blank SharePoint lists (i.e., only Create) and with full ones where only Batch Update was activated.
Any idea what could be causing the Excel connector to “lose” the table name?
@offirhal Hello,
@Yutao do you know why an Excel connector would “lose the table name”?
@offirhal I've not seen that issue before on any flow. But one way to reduce its occurrence may be to reduce the number of times the Excel connector needs to be called by increasing the number of records you pull in each read. Is the decrease to 5,000 because of your license limitations? In that case, it may be helpful to get a higher license that can read 100,000 records at a time, as that would reduce your error rate 20x.
Otherwise, the only work-around I can think of is to build a kind of explicit retry: add a Scope after the Excel List rows action containing a Delay set to something like 20 seconds followed by a secondary Excel List rows action, set that Scope to run only when the first List rows action fails, then set the actions after the Scope to run both when the Scope succeeds & when it is skipped. You can then include the dynamic content of the back-up List rows next to the dynamic content of the 1st List rows wherever it is used.
That way, if the 1st List rows fails for any reason, the flow will retry with the second & use the 2nd's outputs.
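In the flow's underlying definition, that manual retry wiring would look roughly like the JSON below; the action names are hypothetical, and in practice you set this up through each action's 'Configure run after' menu:

"Scope_Retry": {
  "type": "Scope",
  "runAfter": { "List_rows_first_try": [ "Failed", "TimedOut" ] },
  "actions": {
    "Delay_20_seconds": {
      "type": "Wait",
      "inputs": { "interval": { "count": 20, "unit": "Second" } }
    },
    "List_rows_backup": {
      "type": "OpenApiConnection",
      "runAfter": { "Delay_20_seconds": [ "Succeeded" ] },
      "inputs": { "note": "same parameters as List_rows_first_try" }
    }
  }
},
"Actions_after_the_retry": {
  "runAfter": { "Scope_Retry": [ "Succeeded", "Skipped" ] }
}

Downstream references could then wrap both outputs in something like coalesce(outputs('List_rows_first_try')?['body/value'], outputs('List_rows_backup')?['body/value']) so whichever action actually ran supplies the rows.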
@takolota Thanks for the quick reply!
Yes, as far as I understand there's a 5k pagination limit for cloud-triggered flows (mine is no longer manually triggered). I'll try to ask around about a higher license; interesting idea.
If nothing else works, I'll try your workaround, but I'm hoping there's a simpler solution somewhere...
Without fully understanding @offirhal's scenario and flow, it's a bit hard to tell what might be keeping the flow from finding the table.
The only thing I have seen before that might be similar is when a flow contains a Run script action that creates a table and another Excel Online (Business) action (e.g., "Get tables" or "List rows present in a table") that tries to use that table in the same workbook. This can introduce a race condition between the two actions: sometimes the second action won't be able to see the table created by the Run script action.
The mitigation is normally to make the second action wait a few minutes before accessing the same workbook.
Another option is to implement the operations previously done by other Excel Online (Business) actions in the Run script action as well. We have seen quite a few problems with flows that contain both the Run script action and other Excel Online (Business) actions interacting with the same workbook.