Update & Create Excel Records 50-100x Faster
I was able to develop one Office Script to update rows and another to create rows from Power Automate array data. So instead of the flow making a separate API call action for each individual row update or creation, it can send a single array of new data and the Office Scripts will match up primary key values, update each row they find, then create the rows they don't find.
And these Scripts do not require manually entering or changing any column names in the Script code.
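For anyone curious how that matching works, here's a minimal Office Script sketch of the idea (illustrative names & structure only, the actual template scripts linked at the bottom are more involved):

function main(workbook: ExcelScript.Workbook, tableName: string, keyColumn: string, updatedData: string) {
  const table = workbook.getTable(tableName);
  const headers = table.getHeaderRowRange().getValues()[0] as string[];
  const keyIndex = headers.indexOf(keyColumn);
  // Index every existing row by its primary key value for fast lookups
  const rows = table.getRangeBetweenHeaderAndTotal().getValues();
  const rowByKey = new Map<string, number>();
  rows.forEach((row, i) => rowByKey.set(String(row[keyIndex]), i));
  // Update matched rows in memory; collect unmatched records for the create script
  const notFound: Record<string, unknown>[] = [];
  for (const record of JSON.parse(updatedData) as Record<string, unknown>[]) {
    const rowIndex = rowByKey.get(String(record[keyColumn]));
    if (rowIndex === undefined) {
      notFound.push(record);
    } else {
      headers.forEach((h, c) => {
        if (record[h] !== null && record[h] !== undefined) {
          rows[rowIndex][c] = record[h] as string | number | boolean;
        }
      });
    }
  }
  table.getRangeBetweenHeaderAndTotal().setValues(rows);
  return JSON.stringify(notFound);
}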
• In testing with batches of 1,000 updates or creates, it handles ~2,000 row updates or creates per minute, 50x faster than the standard Excel create-row or update-row actions at max 50 concurrency. And it accomplished all the creates or updates with fewer than 4 actions, or only 0.4% of the 1,000 API call actions the standard approach would need.
• The Run script code for processing data has 2 modes: Mode 2, a batch method that saves & updates a new instance of the table in memory before posting batches of table ranges back to Excel, & Mode 1, a row-by-row update that calls on the Excel table directly.
The Mode 2 batch processing method activates for updates on tables with fewer than 1 million cells. It encounters more errors on larger tables because it loads & works with the entire table in memory (a rough sketch of the chunked write-back approach follows the links below).
Shoutout to Sudhi Ramamurthy for this great batch processing addition to the template!
Code Write-Up: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset
Video: https://youtu.be/BP9Kp0Ltj7U
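The gist of that sample's batching, as a rough sketch (the chunk size here is illustrative, the real sample tunes it to stay under payload limits):

// Mode 2: work on the whole table in memory, then write it back in row chunks
const dataRange = table.getRangeBetweenHeaderAndTotal();
const allValues = dataRange.getValues(); // entire table loaded into memory
// ...apply all updates to allValues here...
const CHUNK_ROWS = 5000; // illustrative chunk size
for (let start = 0; start < allValues.length; start += CHUNK_ROWS) {
  const chunk = allValues.slice(start, start + CHUNK_ROWS);
  // Resize a one-row range down to the chunk height, then write the chunk back
  dataRange.getRow(start).getResizedRange(chunk.length - 1, 0).setValues(chunk);
}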
The Mode 1 row-by-row method activates for Excel tables with more than 1 million cells. But it is still limited by batch file size, so updates on larger tables will need to run with smaller cloud flow batch sizes of less than 1,000 in a Do until loop.
The Mode 1 row by row method is also used when the ForceMode1Processing field is set to Yes.
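So the mode choice inside the script boils down to something like this (illustrative, not the exact template code):

// Fall back to row-by-row processing for very large tables, or when forced
const cellCount = table.getRangeBetweenHeaderAndTotal().getCellCount();
const useMode1 = forceMode1Processing || cellCount >= 1000000;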
Be aware that some characters in column names, like \ / - _ . : ; ( ) & $, may cause errors when processing the data. Backslashes \ in the data itself, since they are normally used to escape characters in strings, may also cause errors when the JSON is processed.
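For context on the backslash issue: inside a JSON string a literal backslash must be doubled, so values carrying single backslashes can break JSON.parse. If you are building JSON yourself anywhere in the pipeline, letting JSON.stringify handle the escaping avoids this (a general note, not something specific to the template):

const path = "C:\\folder\\file.txt"; // a value containing literal backslashes
const json = JSON.stringify({ Path: path }); // => {"Path":"C:\\folder\\file.txt"}, safely escaped
JSON.parse(json); // round-trips cleanly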
Version 7 Note
Departing from what is shown in the video, I was able to remove almost all the flow actions in the "Match new and existing data key values then batch update" scope after replicating their functions in the scripts themselves. The flow now goes directly from the "SelectGenerateData" action to the "Run script Update Excel rows" action, and the script handles matching up the UpdatedData JSON keys/field names to the destination table headers.
Version 7 also changes the primary key set-up in the SelectGenerateData action and the logic for skipping cell value updates & blanking out cell values.
The primary key column name from the destination table must now be present in the SelectGenerateData action, with the dynamic content for the values you want to match on. There is no more separate 'PrimaryKey' line; the update script automatically references the primary key column in the SelectGenerateData data based on the PrimaryKeyColumnName input on the update script action in the flow.
Leaving a blank "" in the SelectGenerateData action will now make that cell value in the Excel table empty, while leaving a null will skip the update & leave the existing cell value in the Excel table unaltered.
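For example, a record shaped like this (field names are illustrative) would match on ID, blank out the Status cell, & leave the Notes cell untouched:

const record = {
  ID: 42,       // primary key value matched against the destination table
  Status: "",   // empty string "": the Status cell gets cleared
  Notes: null   // null: the existing Notes cell value is left as-is
};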
Version 6 which looks closer to the version shown in the video can still be accessed here: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p...
Version 7 Set-Up Instructions
Go to the bottom of this post & download the BatchExcel_1_0_0_xx.zip file. Go to the Power Apps home page (https://make.powerapps.com/), select Solutions on the left-side menu, select Import solution, browse your files, & select the BatchExcel_1_0_0_xx.zip file you just downloaded. Then select Next & follow the menu prompts to apply or create the required connections for the solution flows.
Once imported, find the Batch Excel solution in the list of solutions & click it to open the solution. Then click on the Excel Batch Upserts V7 item to open the flow. Once inside the flow, delete the PlaceholderValue Delete after import action.
Open the Office Script Batch Update compose action containing the BatchUpdateV7 script code. Select everything inside the compose input & Ctrl+C copy it to the clipboard.
Then find & open an Excel file in Excel Online. Go to the Automate tab, click on All Scripts & then click on New Script.
When the new script opens, select everything in the script & Ctrl+V paste the BatchUpdateV7 code from the clipboard into the editor. Then rename the script BatchUpdateV7 & save it. That makes BatchUpdateV7 referenceable in the later Run script flow action.
Do the same process to import the BatchCreateV7 script.
Then go to the List rows Sample source data action. If you are going to use an Excel table as the source of updated data, then you can fill in the Location, Document Library, File, & Table information on this action. If you are going to use a different source of updated data like SharePoint, SQL, Dataverse, an API call, etc., then delete the List rows Sample source data placeholder action & insert your new get data action for that source.
Following that, go to the Excel batch update & create scope. Open the PrimaryKeyColumnName action, remove the placeholder values in the action input & input the column name of the unique primary key for your destination Excel table. For example, I use ID for the sample data.
Then go to the SelectGenerateData action.
If you replaced the List rows Sample source data action with a new get data action, then you will need to replace the values dynamic content from the sample action with the equivalent dynamic content from your new get data action (whatever dynamic content outputs the JSON array of updated data).
In either case, you will need to input the table header names from the destination Excel table on the left & the dynamic content for the updated values from the source action on the right. You MUST include the column header for the destination primary key column & the primary key values from the updated source data here so the update script can match the rows in the destination table with their updates from the source data. All the other columns are optional; you only need to include them if you want to update their values.
After you have added all the columns & updated data you want to the SelectGenerateData action, then you can move to the Run script Update Excel rows action. Here add the Location, Document Library, File Name, Script, Table Name, Primary Key Column Name, ForceMode1Processing, & UpdatedData. You will likely need to select the right-side button to switch the UpdatedData input to a single array input before inserting the dynamic content for the SelectGenerateData action.
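Those inputs line up with the parameters of the script's main function, which presumably looks something like this (reconstructed from the action inputs above, not copied from the actual script code):

function main(
  workbook: ExcelScript.Workbook,
  TableName: string,
  PrimaryKeyColumnName: string,
  ForceMode1Processing: boolean,
  UpdatedData: { [key: string]: string | number | boolean | null }[] // the SelectGenerateData array, hence the single array input
) {
  // ...batch update logic...
}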
Then open the Condition If new records continue else skip action & open the Run script Create Excel rows action. In this run script action input the Location, Document Library, Script, Table Name, & CreateData. You again will likely need to select the right-side button to change the CreateData input to a single array input before inserting the dynamic content for the Filter array Get records not found in table output.
If you need just a batch update, then you can remove the Filter array Get records not found in table & Run script Create Excel rows actions.
If you need just a batch create, then you can replace the Run script Update Excel rows action with the Run script Create Excel rows action (deleting the original update script action) and remove the remaining Filter array Get records not found in table action. Then any updated source data sent to the SelectGenerateData action will just be created; the flow won't check for rows to update.
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
Office Script Code
(Also included in a Compose action at the top of the template flow)
Batch Update Script Code: https://drive.google.com/file/d/1kfzd2NX9nr9K8hBcxy60ipryAN4koStw/view?usp=sharing
Batch Create Script Code: https://drive.google.com/file/d/13OeFdl7em8IkXsti45ZK9hqDGE420wE9/view?usp=sharing
(ExcelBatchUpsertV7 is the core piece, ExcelBatchUpsertV7b includes a Do until loop set-up if you plan on updating and/or creating more than 1000 rows on large tables.)
Video: https://www.youtube.com/watch?v=HiEU34Ix5gA
The action looks like this:
Looking at the input, the JSON is being cut off and only has about half the records, and the final record only contains a subset of its fields and is missing the PK.
Tracing this back through the steps, it looks like in the ProcessAllCells section the SelectDestinationIndex actions are not processing the full JSON array and half is being left behind. It is a large table I'm processing (11,500 rows with 32 columns).
Any suggestions?
Yes, the ProcessAllCells section is set to handle up to 210,000 cells of updates at a time. How many rows & columns are you updating each time you run the flow? If you're trying to load more than 210,000 cells (columns x rows) of updates in one go (for example, 11,500 rows x 32 columns = 368,000 cells, well over the limit), then it either needs more SelectDestinationIndexCells actions or you will need to use version b so it can process in several loops/chunks.
If you can share the number of columns & rows or cells you ultimately want to update at a time, then I can adjust the flow & send you a copy with the expanded capacity.
However, if you are continuously adding 210,000+ cells to your table, you'll likely exceed the 1 million cell table limit soon & need to use version b anyway.
I am using version b at the moment, and I've tripled the actions in the ProcessAllCells section.
The source of my data is a Power BI dataset that refreshes daily. The table of updated values will have mostly the same data, so it will update nearly every row of the 11,500 in the Excel table, and every few days this will grow by a few rows. So it'll take some time for the table to reach 1 million cells, since each run only creates a couple of new rows. Updating the existing records is most of what the flow/script is doing. So I suppose each run will involve about 368,000 cells, but this will grow gradually over time.
Looks like it's now working but is stuck in the Do until loop: all records have been updated but the flow is still going after an hour.
The source table is about 105k cells (18k rows x 6 columns). This grows by approx 1200 cells every day.
@woolie
Ahh okay. If you're already on version b, the problem isn't the ProcessAllCells piece; it's likely that you replaced the Excel List rows action without replicating the logic that splits the data into batches & continually selects the next batch until all the data has passed through. So what you have is an infinite loop continually re-processing the entire updated data source.
Similar to others who asked about swapping the Excel action for a SharePoint Get items action, you need to go to the SelectGenerateUpdateData action and, in the From field, instead of using the regular Power BI action outputs, use some logic like...
take(skip(InsertPBIOutputsHere, mul(iterationIndexes('Do_until_Update'), variables('BatchSize'))), variables('BatchSize'))
That will continually take the next batch size worth of items from the Power BI outputs & process just that batch before moving on to the next.
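In TypeScript terms, what that expression computes on each loop iteration is just an array slice (illustrative):

// iteration is the Do until loop counter; with batchSize = 1000:
// iteration 0 -> items 0..999, iteration 1 -> items 1000..1999, and so on
const batch = allRecords.slice(iteration * batchSize, (iteration + 1) * batchSize);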
Also, move the Power BI action that pulls the data above the Do until loop so it isn't continually re-reading the data. The Excel action was only inside the loop to handle Excel tables with more than 100,000 rows. If you later need to handle more than 100,000 rows, you will have to work that batch selection logic into the Power BI query itself, so it skips batch size * loop count rows and selects the next batch size worth of data each time.
Hm, that shouldn’t be timing out under any circumstance really.
Can you share more of the error information & screenshots of your flow?
Looks like it's working now. Thanks so much for this 🙂
It does still take about 30 mins. Would altering the batch size or any other parameter speed this up?
Also, I'd like to update certain columns differently depending on whether it is updating an existing row or creating a new one. Any suggestions on how to achieve this?
The main run details show the following error:
Overview of flow:
This appears to be where the flow fails:
@woolie
Yes, it should take about 1 minute per 1,000 rows to update or create. So if your PBI read data action is taking ~10-20 mins and the Excel batch actions are taking ~10-15 mins per 12,000 rows, then that makes sense.
You can try increasing the batch size, but any additional speed improvements there will be marginal and it will risk the actions timing out & failing more often.
If you can, you could try splitting the PBI data load onto a few parallel branches. But that involves finding a way to query a different segment of the data in the Power BI action on each branch. I don't know if there is some kind of Filter input on that action & whether you could filter each branch on something in your data. Ex: if you had a status column, you could split the load into "Pending", "In-Progress", & "Complete" filters.
Then you would need to use the union() expression to combine the outputs of those parallel loads back into a single set of your PBI data.
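For example, mirroring the expression style above (the action names are placeholders for whatever your parallel branches are called):

union(body('Get_PBI_data_branch_1'), body('Get_PBI_data_branch_2'), body('Get_PBI_data_branch_3'))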
Okay thanks, I'll have a play around. Any suggestions for changing the values depending on whether it is updating an existing row or creating a new one?