05-10-2022 09:50 AM
SharePoint Batch Update, Create, & Upsert Template
(Also an option for full dataset synchronizations with the "Full Sync" template below)
Compared to using the basic Apply to Each, Get items, & Update item approach on large amounts of data, this method requires a small fraction of the API calls towards the daily action limit and drastically reduces flow run-times.
It's currently set to take data from any Excel sheet and update records in SharePoint with corresponding data. It works with any potential Power Automate data-source, including HTTP API GET requests, but Excel provides a simple example.
Part of it works like a Vlookup function: it identifies whether a row of updated data in Excel or another datasource matches a SharePoint key column and gets the SharePoint record ID for that match. It then uses the batch update method to update those SharePoint records and the batch create method to create new items for any records without a match.
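For anyone who wants to see the general shape of that lookup before opening the template, here is a simplified single-key sketch of the three expressions involved (taken from the template's actions, with 'Email' standing in for your key column; the actual template concatenates up to two key columns from the settings):

Select Keys from Get items (maps each SharePoint item to a key/item pair):
{ "@{item()?['Email']}": "@item()" }

Compose Reformat keys to single JSON object (merges the array of pairs into one lookup object):
json(replace(replace(replace(string(body('Select_Keys_from_Get_items_1')), '[{', '{'), '}]', '}'), '},{', ','))

GenerateSPData ID field (per source row; returns the matching SharePoint ID or null if there is no match):
outputs('Compose_Reformat_keys_to_single_JSON_object')?[item()?['Email']]?['ID']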
David Wyatt's Flow Optimization Post For Loading SharePoint Records: https://www.linkedin.com/pulse/top-5-ways-optimize-your-flows-david-wyatt/?trackingId=X9bMmnTZ2QBuu4...
Microsoft Batch API Documentation: https://learn.microsoft.com/en-us/sharepoint/dev/sp-add-ins/make-batch-requests-with-the-rest-apis
TachyTelic.Net Blog & Videos
SharePoint Batch Create Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Video: https://youtu.be/2dV7fI4GUYU
SharePoint Batch Delete Flow
Blog: https://www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
Video: https://www.youtube.com/watch?v=2ImkuGpEeoo
Version 2.7 - Upsert
-Includes a batch create segment to create an upsert capability. If anyone wants to only update records, then they can remove the Batch Create section. If anyone wants to only create records, then they can go to the GenerateSPData action, remove the expression for the ID field and insert the null value expression.
-Further simplifies the set-up, removing the need to add any additional SharePoint Get items actions & removing the need for parallel branches.
-Can now work on lists with a few million items without adding more actions or branches. It also implements a faster load method using the SharePoint HTTP action, as described in point 5 of David Wyatt's flow optimization article linked above (a rough sketch of that call follows these version notes).
-The batch loops have been changed from Do until loops to chunking into Apply to each loops so the batch actions can now run concurrently for additional speed. If you have many batches of data you want to process faster, you can try increasing the concurrency settings on the Apply to each loops containing the SendBatch actions.
-The "settings" inputs action was moved to the top of the flow to help accommodate the new streamlined set-up.
-A SP HTTP call now automatically fixes some issues with referencing the correct list name.
-Faster list load time.
-If you need to batch create &/or update hyperlink columns, check this post
-Adds another HTTP call to get the site users into an object indexed/reference-able by email addresses & gives an example of how to use that to batch update a person column. Anytime the updated source dataset has a blank or an email value not found in the top 5000 site users, it will replace any person in that item with a null value.
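For the person column piece, the mapping inside GenerateSPData ends up looking roughly like the line below. This is only an illustration: 'Manager' & 'ManagerEmail' are placeholder column names, and 'Compose_SiteUsers_Object' stands in for whatever the compose action that indexes the site users by email is called in your copy. SharePoint's REST API sets a person column through the internal column name with an Id suffix, so an email with no match simply resolves to null and clears the field:

"ManagerId": "@outputs('Compose_SiteUsers_Object')?[item()?['ManagerEmail']]?['Id']"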
Updated set-up screenshots & instructions in this post: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List/m-p/2225500...
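On the faster list load mentioned in the version notes: the general idea is a Send an HTTP request to SharePoint GET call that pulls only the ID & key columns needed for matching. A rough sketch is below; the exact query & paging in the template may differ, and 'LastID' is just an illustrative variable for paging by ID:

Method: GET
Uri: _api/web/lists/getByTitle('ExampleListName')/items?$select=ID,Email&$top=5000&$filter=ID gt @{variables('LastID')}
Headers: { "Accept": "application/json; odata=nometadata" }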
Full Sync V1.1 (Combined Upsert & Batch Deletion Sync)
The SharePoint Batch Full Sync template pulls in all the SP List & Source data to perform a batch upsert. But then it also adds on a Batch Deletion Sync to find & delete all the records in the SP List that are not in the Source dataset.
Now, this is initially set up to pull in all the datasource records for all batch actions in the same Do until Get source data loop. That piece is limited by the 100MB maximum message / variable size limits in Power Automate, so this Full Sync version will initially only work with datasources of 100MB or less in total size. But that is really only because I'm trying to keep the flow simpler for the majority of users, who likely will not have datasources of many hundreds of thousands of records.
If you want to push past this 100MB limitation, you will need to separate the source get-data for the batch upsert section from the source get-data for the batch deletion sync section.

For the batch upsert section, you can use a set-up like the main batch upsert template, where it loads records with all columns 100,000 at a time (or 100, or 5,000, or whatever your source dataset's per-load limit is) and runs the batch upsert on each source load before the Do until loop gets the next load. That avoids holding anywhere near 100MB in memory at once because it processes one load at a time.

The batch deletion sync section can then use a different source get-data set-up, similar to the "Do until Get destination list IDs + keys" section of the templates, where each loop pulls a load from the source dataset and then uses a Select action to pass only a few of the columns on to the variable holding everything in memory. Since deletions only require the primary key values, you can set the Select to get only the primary key column from each source data load and pass that on to the "Source data outputs" variable. A full listing of all the primary key values in your source dataset will be much smaller than all columns for the entire table, so that 100MB limit should then hold a few million records' worth of the primary key data needed to run the batch deletion sync.
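As a minimal sketch of that deletion-sync Select (assuming 'Email' is the primary key column and 'Get_source_data' is whatever get-data action runs inside the loop):

Select
From: body('Get_source_data')?['value']
Map: { "Email": "@item()?['Email']" }

Each loop run then appends body('Select') to the array feeding the "Source data outputs", so only the key values accumulate in memory.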
Self Update (See the 1st comment below the main post for the zip download)
The SharePoint Self Batch Update assumes you just want to perform simple updates using only the existing data in the list and removes all the actions related to comparing two datasets to find updates. This may be much easier to use if you just want to quickly do something simple like get all the items created in the past month and mark them all with a Complete status.
But you will be limited to using just the data already in the list and any values you can manually input into the flow.
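For that past-month example, the only inputs you would typically touch are something like the two below. The 30-day window & the Complete value are just illustrations, and the exact filter field depends on the get-data action in your copy:

Get items Filter Query:
Created ge '@{addDays(utcNow(), -30)}'

GenerateSPData column mapping:
"Status": "Complete"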
Version 1.5 - Update
This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
The flow can easily expand to any size of SharePoint list by adding more Get items actions and batch update parallel branches. If speed is a concern for anyone, there are ways to make the Get items actions all load in parallel too (up to 50 branches). It's really only limited by how much effort people want to put into their flows & lists.
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing
Version 1 - Update
Version 1 Explanation Video: https://youtu.be/l0NuYtXdcrQ
Download The Template Batch Update Flow
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10gFkycdx6zpRfrI-s_jCDwIK6dpyyDqk/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1e6-INUykIT22ppVh5m4kxz8us_7qXy7q/view?usp=sharing
Formulas For Random Number Columns
SharePoint Rand1To50 Column Calculated Default Value Formula:
=INT(INT(RIGHT(NOW(),2))/2)
Excel Random Column Formula:
=ROUNDDOWN(((Rand()*100)+1)/2, 0)
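For example, if RAND() returns 0.437, the Excel formula works out to ROUNDDOWN((43.7 + 1)/2, 0) = ROUNDDOWN(22.35, 0) = 22.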
If you have trouble importing any of the flows using the standard legacy import, you can also try importing a Power Apps Solutions package here: Re: Batch Update, Create, and Upsert SharePoint Li... - Page 25 - Power Platform Community (microsof...
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
If someone is looking for another way to build the Vlookup type of section that matches the SharePoint ID to the external data records, Paulie just released a video on using XPath for this: https://youtu.be/QSF6dNkSKSA
Looks like Paul did start working on some batch Dataverse actions a few months ago with batch delete. However, there is a bit more set-up in Azure to run the Dataverse API calls.
https://www.tachytelic.net/2021/11/power-automate-bulk-delete-dataverse/
I'll look into this more and see what I can do for a batch update template for Dataverse.
Thank You So Much! This was very helpful and what I needed for a list of about 6000 items I need to update daily from SQL Server.
I'm not an IT person and am just learning these flows for work, but I know enough to be dangerous. I wanted to provide some feedback on an issue I had with your flow as well as with TachyTelic's. It took me hours trying to get a batch create to work, so I gave up. Then I saw your batch update, and the extra details in that flow let me figure out how this works and get the flow working. I think it could help make it easier for others to use.
Issue: SharePoint List Display Name vs Web Address Name
The list name from Settings is used in multiple places, but the formatting has to be adjusted based on where it is used. In the batchTemplate, the SP list name needs to be the regular display name. In the GenerateSPData __metadata, the name needs to be the web address name. The OP tried to help the __metadata field by adding some replace values for underscores, but that only works if the name never changed.
If a SharePoint list is created with the initial name “InitialSharePointName” and the display name is later changed to “New Display Name”, the flow will never call the correct web address, since the web address keeps the initial name, just like column internal names do.
Solution to the first issue: add a second entry in the Settings and differentiate the SharePoint display name from the SharePoint web address name.
Original Settings Format:
{
"siteAddress": "https://OrgName.sharepoint.com/sites/SiteName/",
"listName": "ExampleListName",
"batchSize": 1000
}
New Settings Format:
{
"siteAddress": "https://OrgName.sharepoint.com/sites/SiteName/",
"SPListDisplayTitleName": "(Copy Sharepoint Name from List Settings)",
"SPListWebAddressName": "(Copy list name section from the Web Address section in List Settings",
"batchSize": 1000
}
After changing the settings, you have to go into each place they are used and update the expressions.
I hope this helps some people because this flow really helped me with an issue. The only step after this is to make a batch update that only updates specific rows that need to be changed. (This is a unique issue because I'm having to use the SP List as a database for reasons and don't have triggers. I only can compare SP to my SQL and unable to get the list into SQL or write into the SQL.)
Quick feedback for the creator: when you compare a SP list to another data source, please differentiate the column names in the example. I know we keep the names the same so it's easier to reference, and the dynamic content doesn't care, but when I'm trying to look through the code of the expressions, I can't tell what is being used. This would really help with the explanation of the Generate SharePoint data action. The "ID" column in the Excel file was really confusing me; I couldn't tell whether it was being referenced or whether it came from SharePoint. It would help if the Excel columns were labeled something like Ex_ID, Ex_Date, Ex_Name, Ex_Email, Ex_Status and the SharePoint columns were labeled SP_Date, SP_Title, SP_Email, SP_Status.
Thanks @wskinnermctc, I’m glad it helped, even if it took a little extra work.
Unfortunately I won’t have access to SharePoint again until I start my new job. But when I go to create any similar flows for Dataverse, I can incorporate some of that & update things here.
Also, it’s probably a good time to note that this & any other batch action templates won’t readily work if your SharePoint list is on the organization homepage site, because that also changes the format for the SiteAddress. I haven’t yet figured out how that might be formatted in the SharePoint batch API call. If anyone else does, please share that information in this thread.
As for narrowing down which rows you want to update, you should be able to use the filter queries in the 1st SharePoint actions to limit the rows it checks in SharePoint for updates, or filter on whatever new datasource action you use to limit the rows of new data it pulls in for updates. Of course, if the new datasource is Excel, there isn’t a nice built-in way to do complex queries, so you will have to use a Filter array action & replace the Excel action outputs in later actions with the Filter array outputs.
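For example, a Filter array on the Excel output might look like the sketch below (the 'Status' condition is just a placeholder); you would then point the GenerateSPData From field at body('Filter_array') instead of the Excel outputs:

Filter array
From: body('List_rows_present_in_a_table')?['value']
Condition expression: @not(equals(item()?['Status'], 'Complete'))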
I personally did something similar to sync data with a 3rd party site where I filtered API calls from the 3rd party site to rows that needed to be batch created & rows that needed to be batch updated based on whether the rows were already in SharePoint or not.
Version 1.5 Now Available
This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
Now users can just list the primary key column name(s) for the SP list and for the updated datasource from the settings menu. If two columns make up the primary key, then the 2nd column name can be listed in the ...KeyColumnName2 inputs. If there is only one column for the primary keys, then leave the ...KeyColumnName2 inputs blank.
Also if the list name has changed, then the user can fill in the ListWebAddressName input so the flow can use the correct reference in later steps. If the list name did not change, then leave ListWebAddressName blank and the flow will use the listName in all the actions, just like version 1.
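Under the hood, the __metadata type in the GenerateSPData actions falls back roughly like this (pulled from the template & reformatted for readability), while the batch PATCH in batchTemplate keeps using getByTitle with the listName display name:

if(empty(outputs('settings')['ListWebAddressName']),
   replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'),
   replace(replace(outputs('settings')['ListWebAddressName'], '_', '_x005f_'), ' ', '_x0020_'))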
Thanks @wskinnermctc for the feedback!
Google Drive Link to Flow Zip Files: https://drive.google.com/file/d/10p7EB730xsEj-azVYuTIuu8dS0w-AflR/view?usp=sharing
Google Drive Link to Text File to a Scope Action Containing The Flow: https://drive.google.com/file/d/1BVGoeM5mykYlMobAyFkhuLRh3r7jMSLz/view?usp=sharing (Or see the comment below with code you can copy directly from this thread)
Version 1.5 Scope Code to Copy & Paste Into The "My clipboard" Section of a New Action
{"id":"e92739c5-307f-46dc-aad9-a7bc687fa96f","brandColor":"#8C3900","connectionReferences":{"shared_excelonlinebusiness_1":{"connection":{"id":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness/connections/512f9c45ec1947d386e56055c70ac7e2"}},"shared_sharepointonline_2":{"connection":{"id":"/providers/Microsoft.PowerApps/apis/shared_sharepointonline/connections/f2c62e9ce2fc415c8103a6205da5f9d3"}}},"connectorDisplayName":"Control","icon":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=","isTrigger":false,"operationName":"Batch_update_from_external_data","operationDefinition":{"type":"Scope","actions":{"Do_until_Update":{"type":"Until","expression":"@less(length(body('List_rows_present_in_a_table')?['value']), 99990)","limit":{"count":150,"timeout":"PT8H"},"actions":{"Scope_Batch_update_items_in_1st_100k_SP_records":{"type":"Scope","actions":{"Do_until":{"type":"Until","expression":"@less(length(body('Select')), sub(outputs('settings')['batchSize'], 5))","limit":{"count":150,"timeout":"PT7H"},"actions":{"Select":{"type":"Select","inputs":{"from":"@take(skip(body('Filter_array_Remove_items_without_a_key_to_ID_match_in_SharePoint'), mul(outputs('settings')['batchSize'], iterationIndexes('Do_until'))), outputs('settings')['batchSize'])","select":"@replace(replace(outputs('batchTemplate'),'|ID|', string(item()?['ID'])), '|RowData|', string(item()))"},"runAfter":{},"metadata":{"operationMetadataId":"d90bce05-ae93-482c-a536-bd36310988b6"}},"batchData":{"type":"Compose","inputs":"@join(body('Select'), decodeUriComponent('%0A'))","runAfter":{"Select":["Succeeded"]},"metadata":{"operationMetadataId":"37cd3c50-9877-4ec3-a32f-33290e7cf6cb"}},"SendBatch":{"type":"ApiConnection","inputs":{"host":{"connection":{"name":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$connections']['shared_sharepointonline_2']['connectionId']"}},"method":"post","body":{"method":"POST","uri":"/_api/$batch","headers":{"X-RequestDigest":"digest","Content-Type":"multipart/mixed;boundary=batch_@{actions('settings')?['trackedProperties']['batchGUID']}"},"body":"--batch_@{actions('settings')?['trackedProperties']['batchGUID']}\nContent-Type: multipart/mixed; boundary=\"changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\"\nContent-Length: @{length(outputs('batchData'))}\nContent-Transfer-Encoding: 
binary\n\n@{outputs('batchData')}\n--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}--\n\n--batch_@{actions('settings')?['trackedProperties']['batchGUID']}--"},"path":"/datasets/@{encodeURIComponent(encodeURIComponent(outputs('settings')['siteAddress']))}/httprequest","authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"batchData":["Succeeded"]},"limit":{"timeout":"P1D"},"metadata":{"flowSystemMetadata":{"swaggerOperationId":"HttpRequest"},"operationMetadataId":"16e4dda3-3bad-46d0-a0ce-a919bada5f25"}},"Results":{"type":"Compose","inputs":"@base64ToString(body('sendBatch')['$content'])","runAfter":{"SendBatch":["Succeeded"]},"metadata":{"operationMetadataId":"fa9b68d1-10d2-4ca3-8a3a-05c3c14c2f54"}}},"runAfter":{"Filter_array_Remove_items_without_a_key_to_ID_match_in_SharePoint":["Succeeded"]},"metadata":{"operationMetadataId":"c2b0cb55-964e-4bd4-8484-fcf245bd3932"}},"Filter_array_Remove_items_without_a_key_to_ID_match_in_SharePoint":{"type":"Query","inputs":{"from":"@body('GenerateSPData')","where":"@not(equals(item()?['ID'], null))"},"runAfter":{"GenerateSPData":["Succeeded"]},"metadata":{"operationMetadataId":"1c5c3129-35c5-444d-9a4f-aad8f88a4446"}},"GenerateSPData":{"type":"Select","inputs":{"from":"@body('List_rows_present_in_a_table')?['value']","select":{"__metadata":"@json(concat('{\"type\":\"SP.Data.', if(empty(outputs('settings')['ListWebAddressName']), replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'), replace(replace(outputs('settings')['ListWebAddressName'], '_', '_x005f_'), ' ', '_x0020_')), 'ListItem\"}'))","ID":"@outputs('Compose_Reformat_keys_to_single_JSON_object')?[concat(item()[outputs('settings')['NewDatasourceKeyColumnName']], if(empty(outputs('settings')['NewDatasourceKeyColumnName2']), '', item()[outputs('settings')['NewDatasourceKeyColumnName2']]))]?['ID']","Date":"@addDays('1899-12-30', int(item()['Date']), 'u')","Title":"@item()?['Name']","Email":"@item()?['Email']","Status":"@item()?['Status']"}},"runAfter":{"Compose_Reformat_keys_to_single_JSON_object":["Succeeded"]},"description":"In the ID row expression, replace item()?['Email'] with the column(s) in the updated data that make up the keys. Ex: replace it with concat(item()?['ExcelColumnName1'], item()?['ExcelColumnName2']) for a multi-column key.","metadata":{"operationMetadataId":"ce0487d5-63c8-40b4-9b4c-5229c0439c94"}},"Compose_Reformat_keys_to_single_JSON_object":{"type":"Compose","inputs":"@json(replace(replace(replace(string(body('Select_Keys_from_Get_items_1')), '[{', '{'), '}]', '}'), '},{', ','))","runAfter":{"Select_Keys_from_Get_items_1":["Succeeded"]},"metadata":{"operationMetadataId":"69b1d189-d56f-45fc-8dbb-a49740832a64"}},"Select_Keys_from_Get_items_1":{"type":"Select","inputs":{"from":"@outputs('Get_items_1st_100000')?['body/value']","select":{"@{concat(item()[outputs('settings')['SharePointKeyColumnName']], if(empty(outputs('settings')['SharePointKeyColumnName2']), '', item()[outputs('settings')['SharePointKeyColumnName2']]))}":"@item()"}},"runAfter":{},"description":"Action limit is 100k Get items records. Decide on 1+ columns to use on the left side as a unique key that is different for every record in the list. 
For a multiple column key list each, eg: item()?['Column1'], item()?['Column2'], item()?['Column3']).","metadata":{"operationMetadataId":"dac3de39-a3e5-422a-be37-81089ee64a51"}}},"runAfter":{"batchTemplate":["Succeeded"]},"metadata":{"operationMetadataId":"a32ffaf9-e457-4b81-92cf-531cf58ce133"}},"Scope_Batch_update_items_in_2nd_100k_SP_records":{"type":"Scope","actions":{"Do_until_2":{"type":"Until","expression":"@less(length(body('Select_2')), sub(outputs('settings')['batchSize'], 5))","limit":{"count":150,"timeout":"PT7H"},"actions":{"Select_2":{"type":"Select","inputs":{"from":"@take(skip(body('Filter_array_Remove_items_without_a_key_to_ID_match_in_SharePoint_2'), mul(outputs('settings')['batchSize'], iterationIndexes('Do_until_2'))), outputs('settings')['batchSize'])","select":"@replace(replace(outputs('batchTemplate'), '|ID|', string(item()?['ID'])), '|RowData|', string(item()))"},"runAfter":{},"metadata":{"operationMetadataId":"d90bce05-ae93-482c-a536-bd36310988b6"}},"batchData_2":{"type":"Compose","inputs":"@join(body('Select_2'), decodeUriComponent('%0A'))","runAfter":{"Select_2":["Succeeded"]},"metadata":{"operationMetadataId":"37cd3c50-9877-4ec3-a32f-33290e7cf6cb"}},"SendBatch_2":{"type":"ApiConnection","inputs":{"host":{"connection":{"name":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$connections']['shared_sharepointonline_2']['connectionId']"}},"method":"post","body":{"method":"POST","uri":"/_api/$batch","headers":{"X-RequestDigest":"digest","Content-Type":"multipart/mixed;boundary=batch_@{actions('settings')?['trackedProperties']['batchGUID']}"},"body":"--batch_@{actions('settings')?['trackedProperties']['batchGUID']}\nContent-Type: multipart/mixed; boundary=\"changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\"\nContent-Length: @{length(outputs('batchData_2'))}\nContent-Transfer-Encoding: binary\n\n@{outputs('batchData_2')}\n--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}--\n\n--batch_@{actions('settings')?['trackedProperties']['batchGUID']}--"},"path":"/datasets/@{encodeURIComponent(encodeURIComponent(outputs('settings')['siteAddress']))}/httprequest","authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"batchData_2":["Succeeded"]},"limit":{"timeout":"P1D"},"metadata":{"flowSystemMetadata":{"swaggerOperationId":"HttpRequest"},"operationMetadataId":"16e4dda3-3bad-46d0-a0ce-a919bada5f25"}},"Results_2":{"type":"Compose","inputs":"@base64ToString(body('sendBatch_2')['$content'])","runAfter":{"SendBatch_2":["Succeeded"]},"description":"When copying a branch, change the 'sendBatch_#' part of the expression below to match the new SendBatch action above. 
Often it doesn't change the new copy's # in 'SendBatch_#'.","metadata":{"operationMetadataId":"fa9b68d1-10d2-4ca3-8a3a-05c3c14c2f54"}}},"runAfter":{"Filter_array_Remove_items_without_a_key_to_ID_match_in_SharePoint_2":["Succeeded"]},"metadata":{"operationMetadataId":"c2b0cb55-964e-4bd4-8484-fcf245bd3932"}},"Filter_array_Remove_items_without_a_key_to_ID_match_in_SharePoint_2":{"type":"Query","inputs":{"from":"@body('GenerateSPData_2')","where":"@not(equals(item()?['ID'], null))"},"runAfter":{"GenerateSPData_2":["Succeeded"]},"metadata":{"operationMetadataId":"1c5c3129-35c5-444d-9a4f-aad8f88a4446"}},"GenerateSPData_2":{"type":"Select","inputs":{"from":"@body('List_rows_present_in_a_table')?['value']","select":{"__metadata":"@json(concat('{\"type\":\"SP.Data.', if(empty(outputs('settings')['ListWebAddressName']), replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'), replace(replace(outputs('settings')['ListWebAddressName'], '_', '_x005f_'), ' ', '_x0020_')), 'ListItem\"}'))","ID":"@outputs('Compose_Reformat_keys_to_single_JSON_object_2')?[concat(item()[outputs('settings')['NewDatasourceKeyColumnName']], if(empty(outputs('settings')['NewDatasourceKeyColumnName2']), '', item()[outputs('settings')['NewDatasourceKeyColumnName2']]))]?['ID']","Date":"@addDays('1899-12-30', int(item()['Date']), 'u')","Title":"@item()?['Name']","Email":"@item()?['Email']","Status":"@item()?['Status']"}},"runAfter":{"Compose_Reformat_keys_to_single_JSON_object_2":["Succeeded"]},"description":"In the ID row expression, replace item()?['Email'] with the column(s) in the updated data that make up the keys. Ex: replace it with concat(item()?['ExcelColumnName1'], item()?['ExcelColumnName2']) for a multi-column key.","metadata":{"operationMetadataId":"ce0487d5-63c8-40b4-9b4c-5229c0439c94"}},"Compose_Reformat_keys_to_single_JSON_object_2":{"type":"Compose","inputs":"@json(replace(replace(replace(string(body('Select_Keys_from_Get_items_2')), '[', ''), ']', ''), '},{', ','))","runAfter":{"Select_Keys_from_Get_items_2":["Succeeded"]},"metadata":{"operationMetadataId":"69b1d189-d56f-45fc-8dbb-a49740832a64"}},"Select_Keys_from_Get_items_2":{"type":"Select","inputs":{"from":"@outputs('Get_items_2nd_100000')?['body/value']","select":{"@{concat(item()[outputs('settings')['SharePointKeyColumnName']], if(empty(outputs('settings')['SharePointKeyColumnName2']), '', item()[outputs('settings')['SharePointKeyColumnName2']]))}":"@item()"}},"runAfter":{},"description":"When copying a branch, change the SharePoint value in the From field to the next Get items output that you added at the top of the flow.","metadata":{"operationMetadataId":"dac3de39-a3e5-422a-be37-81089ee64a51"}}},"runAfter":{"batchTemplate":["Succeeded"]},"metadata":{"operationMetadataId":"92a89bdf-e08b-4767-a93f-831f6b8eeb76"}},"batchTemplate":{"type":"Compose","inputs":"--changeset_@{actions('settings')?['trackedProperties']['changeSetGUID']}\nContent-Type: application/http\nContent-Transfer-Encoding: binary\n\nPATCH @{outputs('settings')['siteAddress']}_api/web/lists/getByTitle('@{outputs('settings')['listName']}')/items(|ID|) HTTP/1.1\nContent-Type: application/json;odata=verbose\nAccept: application/json;odata=verbose\nIf-Match: 
*\n\n|RowData|\n","runAfter":{"settings":["Succeeded"]},"metadata":{"operationMetadataId":"d0af0142-cd1d-437c-97a2-98210ea8dbde"}},"settings":{"type":"Compose","inputs":{"siteAddress":"https://OrgName.sharepoint.com/sites/SiteName/","listName":"ExampleListName","ListWebAddressName":"","batchSize":800,"SharePointKeyColumnName":"Email","SharePointKeyColumnName2":"","NewDatasourceKeyColumnName":"Email","NewDatasourceKeyColumnName2":""},"runAfter":{"List_rows_present_in_a_table":["Succeeded"]},"description":"List to batch update & the batch size (max 1000). Web address name if the list name changed. The column name(s) in SP & in the updated datasource to match in the lookup step (1st SP row /w matching key column value(s) is updated /w the new data).","trackedProperties":{"meta":{"type":"SP.Data.@{outputs('settings')?['listName']}ListItem"},"batchGUID":"@{guid()}","changeSetGUID":"@{guid()}"},"metadata":{"operationMetadataId":"459dc501-ad7e-4252-8540-df307a69607f"}},"List_rows_present_in_a_table":{"type":"ApiConnection","inputs":{"host":{"connection":{"name":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$connections']['shared_excelonlinebusiness_1']['connectionId']"}},"method":"get","path":"/drives/@{encodeURIComponent('b!_EMfBFNtWEOa3ak75AQ0TW2Q1OX85QRDkj8zbxn7udyJYnSyNkLtSYhgHecJU74Y')}/files/@{encodeURIComponent(encodeURIComponent('01RD23GUU5UPXK7LQNWBCITP653FJPSCXW'))}/tables/@{encodeURIComponent('{ED2ABACC-AB30-4ACC-AC19-42CE68FEB007}')}/items","queries":{"source":"me","$top":100000,"$skip":"@add(mul(iterationIndexes('Do_until_Update'), 99998), iterationIndexes('Do_until_Update'))"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{},"description":"The expression in Skip Count will skip any 100k batches already loaded by previous loop runs. You may need to create something similar if you use any alternative get data action in this Do until loop.","runtimeConfiguration":{"paginationPolicy":{"minimumItemCount":100000}},"metadata":{"016B7R4Y7EON4XZ3K5URCJG7TCJZ2K7B6S":"/Power Automate Files/DateInfo.xlsx","flowSystemMetadata":{"swaggerOperationId":"GetItems"},"016B7R4YYVOTO4F52WHNAZ4NTGFWMYOH66":"/Power Automate Files/thanksgiving-metropolitan-ministries-holidays11-22.xlsx","operationMetadataId":"d3258554-7310-4453-9d1c-103d10112938","016B7R4Y3LVLCEITRXYVBIVHINFYOQURWY":"/Example_Folder/Example_File.xlsx","01RD23GUU5UPXK7LQNWBCITP653FJPSCXW":"/TestExcel.xlsx"}}},"runAfter":{},"description":"If you change the List rows present in a table to something else, change the length() expression below to the length of your new get data action & the 99990 to the new get data batch size. Then insert the action values in GenerateSPData From fields.","metadata":{"operationMetadataId":"f668d1f3-7245-4e88-9952-6b7e2bec05b9"}}},"runAfter":{"Get_items_2nd_100000":["Succeeded"]},"description":"Adjust the \"settings\" compose action to fit your data & datasources. Then adjust the GenerateSPData actions in each branch to include the right SharePoint column names & new updated data for your use-case.","metadata":{"operationMetadataId":"921a4fd4-4def-40df-a301-6402606fa56f"}}}
Someone was having trouble with Excel dates in their source data coming up as serial numbers, even with the ISO 8601 date setting active on the Excel List rows present in a table action.
A reminder that you can change any Excel date number to its correct date using an expression in the GenerateSPData action.
https://www.tachytelic.net/2020/11/convert-excel-dates-power-automate/?amp
addDays('1899-12-30', int(item()['Date']), 'dd-MM-yyyy')
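For instance, an Excel serial value of 44986 converts as addDays('1899-12-30', 44986, 'yyyy-MM-dd') = 2023-03-01 (or 01-03-2023 with the 'dd-MM-yyyy' format above).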
I’m also working on a Batch Update Excel if anyone is interested.
It uses Office Scripts instead of any premium HTTP action. It’s much faster than the standard Excel update options in Power Automate and uses a small fraction of action API calls on larger updates.
https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Excel/m-p/1624706
First, thank you for sharing this with us.
I'm trying to get started with your code based on the copy/paste method. But as soon as I paste the code, I get this strange behavior for the ID part in the 'Generate SPData' action.
I've also provided the data in the 'Settings' action, as you can see on the screenshot.
Can you give any guidance?