11-29-2022 18:49 PM - last edited 11-29-2022 19:29 PM
Update & Create Excel Records 50-100x Faster
I was able to develop one Office Script to update rows and another Office Script to create rows from Power Automate array data. So instead of the flow making a separate action API call for each individual row update or creation, it can just send one array of new data and the Office Scripts will match up primary key values, update each row they find, then create the rows they don't find.
And these Scripts do not require manually entering or changing any column names in the Script code.
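The matching logic described above can be sketched in plain TypeScript (a simplified illustration without the ExcelScript APIs; `upsertSplit` and the sample field names are hypothetical, not names from the actual scripts):

```typescript
// Sketch of the upsert idea: match incoming records to existing rows by
// primary key, update the matches, and collect the non-matches for creation.
type Row = Record<string, string>;

function upsertSplit(existing: Row[], incoming: Row[], pk: string): Row[] {
  // Index existing rows by their primary key value for fast lookup.
  const index = new Map<string, number>(
    existing.map((r, i) => [r[pk], i] as [string, number])
  );
  const toCreate: Row[] = [];
  for (const rec of incoming) {
    const i = index.get(rec[pk]);
    if (i === undefined) {
      toCreate.push(rec); // no key match: hand off to the create script
    } else {
      Object.assign(existing[i], rec); // key match: update the row in place
    }
  }
  return toCreate;
}
```

In the real flow the update script does the matching and returns the not-found keys, and a Filter array action routes those records to the create script.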
• In testing with batches of 1000 updates or creates, it handles ~2000 row updates or creates per minute, 50x faster than the standard Excel create-row or update-row actions at their maximum concurrency of 50. And it accomplished all the creates or updates in fewer than 4 actions, only 0.4% of the 1000 action API calls the standard approach requires.
• The Run script code for processing data has 2 modes: the Mode 2 batch method, which updates a copy of the table in memory before posting batches of table ranges back to Excel, and the Mode 1 row-by-row method, which updates the Excel table directly.
The Mode 2 batch processing method activates for updates on tables with fewer than 1 million cells. It encounters more errors on larger tables because it loads & works with the entire table in memory.
Shoutout to Sudhi Ramamurthy for this great batch processing addition to the template!
Code Write-Up: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset
Video: https://youtu.be/BP9Kp0Ltj7U
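The batching technique from that sample boils down to slicing the in-memory 2D values array into row chunks sized to stay under a cells-per-call budget. A simplified sketch of just the chunking (without the ExcelScript range writes; `rowBatches` is an illustrative name):

```typescript
// Split a 2D values array into row batches so that each write call stays
// under a cells-per-batch budget (the Microsoft sample uses ~15,000 cells).
function rowBatches<T>(values: T[][], cellsPerBatch: number): T[][][] {
  // How many whole rows fit in one batch, given the column count.
  const rowsPerBatch = Math.max(1, Math.floor(cellsPerBatch / values[0].length));
  const batches: T[][][] = [];
  for (let i = 0; i < values.length; i += rowsPerBatch) {
    batches.push(values.slice(i, i + rowsPerBatch));
  }
  return batches;
}
```

The script then writes each batch to an offset range with `setValues`, which is far faster than setting cells one at a time.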
The Mode 1 row-by-row method activates for Excel tables with more than 1 million cells. But it is still limited by batch file size, so updates on larger tables will need to run with smaller cloud flow batch sizes of less than 1000 in a Do until loop.
The Mode 1 row-by-row method is also used whenever the ForceMode1Processing field is set to Yes.
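Putting those rules together, the mode selection mirrors the condition in the script code (the 1,000,000-cell threshold is the script's `TableSizeBatchProcLimit` constant; `pickMode` is an illustrative name):

```typescript
// Decide processing mode the way the scripts do: Mode 2 batch processing
// unless the table is at/over 1M cells or Mode 1 is explicitly forced.
const TABLE_SIZE_BATCH_LIMIT = 1_000_000;

function pickMode(rowCount: number, columnCount: number, forceMode1: string): 1 | 2 {
  const tableSize = rowCount * columnCount;
  return tableSize < TABLE_SIZE_BATCH_LIMIT && forceMode1 !== "Yes" ? 2 : 1;
}
```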
Be aware that some characters in column names, like \ / - _ . : ; ( ) & $, may cause errors when processing the data. Backslashes \ in the data, which are usually used to escape characters in strings, may also cause errors when processing the JSON.
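If your data or headers may contain such characters, one defensive option is to pre-process them before building the JSON. These helpers are hypothetical, not part of the template, and shown only as a sketch of that workaround:

```typescript
// Hypothetical pre-processing (not part of the template): escape backslashes
// in data values so the resulting JSON string parses cleanly, and strip the
// risky punctuation characters out of column names.
function escapeBackslashes(value: string): string {
  return value.replace(/\\/g, "\\\\");
}

function sanitizeColumnName(name: string): string {
  return name.replace(/[\\\/\-_.:;()&$ ]/g, "");
}
```

In Power Automate itself, the equivalent would be a `replace()` expression in the Select action rather than script code.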
Version 7 Note
Departing from what is shown in the video, I was able to remove almost all the flow actions in the "Match new and existing data key values then batch update" scope after replicating their functions in the scripts themselves. The flow now goes directly from the "SelectGenerateData" action to the "Run script Update Excel rows" action, and the script handles matching up the UpdatedData JSON keys/field names to the destination table headers.
Version 7 also changes the primary key set-up in the SelectGenerateData action and changes the logic for skipping cell value updates & blanking out cell values.
Now the primary key column name from the destination table must be present in the SelectGenerateData action, with the dynamic content for the values you want to match on. There is no more 'PrimaryKey' line; the update script automatically references the primary key column in the SelectGenerateData data based on the PrimaryKeyColumnName input on the update script action in the flow.
Now leaving a blank "" in the SelectGenerateData action will make that cell value in the Excel table empty, while leaving a null in the SelectGenerateData action will skip the update and leave the existing cell value in the Excel table unchanged.
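That skip-vs-blank behavior can be sketched as follows (simplified; the real script works on column indexes rather than names, and `applyUpdate` is an illustrative name):

```typescript
// Apply one update record to one table row: null means "leave the cell
// alone", empty string means "blank the cell out", anything else overwrites.
function applyUpdate(
  row: Record<string, string | null>,
  update: Record<string, string | null>
): void {
  for (const [column, value] of Object.entries(update)) {
    if (value === null) continue; // null: skip, keep the existing cell value
    row[column] = value;          // "" blanks the cell; other values overwrite
  }
}
```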
Version 6 which looks closer to the version shown in the video can still be accessed here: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p...
Version 7 Set-Up Instructions
Go to the bottom of this post & download the BatchExcel_1_0_0_xx.zip file. Go to the Power Apps home page (https://make.powerapps.com/), select Solutions on the left-side menu, select Import solution, browse your files, & select the BatchExcel_1_0_0_xx.zip file you just downloaded. Then select Next & follow the menu prompts to apply or create the required connections for the solution flows.
Once imported, find the Batch Excel solution in the list of solutions & click it to open the solution. Then click on the Excel Batch Upserts V7 item to open the flow. Once inside the flow, delete the PlaceholderValue Delete after import action.
Open the Office Script Batch Update compose action containing the BatchUpdateV7 script code. Select everything inside the compose input & control + C copy it to the clipboard.
Then find & open an Excel file in Excel Online. Go to the Automate tab, click on All Scripts & then click on New Script.
When the new script opens, select everything in the script & control + V paste the BatchUpdateV7 script from the clipboard over it. Then rename the script BatchUpdateV7 & save it. That should make BatchUpdateV7 referenceable in the later Run script flow action.
Do the same process to import the BatchCreateV7 script.
Then go to the List rows Sample source data action. If you are going to use an Excel table as the source of updated data, then you can fill in the Location, Document Library, File, & Table information on this action. If you are going to use a different source of updated data like SharePoint, SQL, Dataverse, an API call, etc., then delete the List rows Sample source data placeholder action & insert the new get-data action for your other source of updated data.
Following that, go to the Excel batch update & create scope. Open the PrimaryKeyColumnName action, remove the placeholder values in the action input & input the column name of the unique primary key for your destination Excel table. For example, I use ID for the sample data.
Then go to the SelectGenerateData action.
If you replaced the List rows Sample source data action with a new get data action, then you will need to replace the values dynamic content from that sample action with the new values dynamic content of your new get data action (the dynamic content that outputs a JSON array of the updated data).
In either case, you will need to input the table header names from the destination Excel table on the left & the dynamic content for the updated values from the source action on the right. You MUST include the column header for the destination primary key column & the primary key values from the updated source data here, so the Update script can match up which rows in the destination table need which updates from the source data. All the other columns are optional; you only need to include them if you want to update their values.
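As a concrete illustration, the SelectGenerateData mapping is equivalent to reshaping each source record into an object keyed by the destination header names. All the header and source field names below ("ID", "SourceC1", etc.) are hypothetical samples, not names required by the template:

```typescript
// Reshape source records into the shape SelectGenerateData produces:
// keys are destination table headers, values come from the source record.
type SourceRecord = Record<string, string | null>;

function generateUpdateData(sourceRows: SourceRecord[]) {
  return sourceRows.map((r) => ({
    ID: r["SourceC1"],   // primary key column: required for matching
    Name: r["SourceC2"], // optional column you want to update
    Status: null,        // null: leave the existing cell values unchanged
    Notes: "",           // "": blank those cells out
  }));
}
```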
After you have added all the columns & updated data you want to the SelectGenerateData action, then you can move to the Run script Update Excel rows action. Here add the Location, Document Library, File Name, Script, Table Name, Primary Key Column Name, ForceMode1Processing, & UpdatedData. You will likely need to select the right-side button to switch the UpdatedData input to a single array input before inserting the dynamic content for the SelectGenerateData action.
Then open the Condition If new records continue else skip action & open the Run script Create Excel rows action. In this run script action, input the Location, Document Library, Script, Table Name, & CreateData. You will again likely need to select the right-side button to change the CreateData input to a single array input before inserting the dynamic content from the Filter array Get records not found in table output.
If you need just a batch update, then you can remove the Filter array Get records not found in table & Run script Create Excel rows actions.
If you need just a batch create, then you can replace the Run script Batch update rows action with the Run script Batch create rows action, delete the update script action, and remove the remaining Filter array Get records not found in table action. Then any source data sent to the SelectGenerateData action will just be created; the flow won't check for rows to update.
Thanks for any feedback,
Please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86).
And reach out on LinkedIn (https://www.linkedin.com/in/kolota/) if you want to hire me to consult or build more custom Microsoft solutions for you.
Office Script Code
(Also included in a Compose action at the top of the template flow)
Batch Update Script Code: https://drive.google.com/file/d/1kfzd2NX9nr9K8hBcxy60ipryAN4koStw/view?usp=sharing
Batch Create Script Code: https://drive.google.com/file/d/13OeFdl7em8IkXsti45ZK9hqDGE420wE9/view?usp=sharing
(ExcelBatchUpsertV7 is the core piece, ExcelBatchUpsertV7b includes a Do until loop set-up if you plan on updating and/or creating more than 1000 rows on large tables.)
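The V7b Do until variant essentially loops over the source data in chunks small enough for each Run script call; the chunking itself can be sketched as (the 1000-row size comes from the batch limit noted above, and `toBatches` is an illustrative name):

```typescript
// Split the source rows into batches of at most `size` rows (e.g. 1000),
// so each Run script call on a large table stays under the payload limits.
function toBatches<T>(rows: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}
```

In the cloud flow this looping is done with a Do until plus skip/take expressions rather than script code, but the slicing logic is the same.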
Video: https://www.youtube.com/watch?v=HiEU34Ix5gA
I am getting an error when I attempt to import any version of this flow. I have imported other packages successfully, but this one keeps failing. Has anyone else experienced this?
The simpler expressions of the new V5 are also easier to fix when they error in the copy & paste clipboard import method, so I'll share that back-up import method here for anyone facing issues with the standard import method.
For this, you'll need to copy the code below to your clipboard. It contains the code for a Power Automate Scope action that holds the entire flow. You can then go to a new action menu in any flow, go to the "My clipboard" tab, and control + v paste it into that tab so you can select it and add it to the flow. The extra hurdle with this method is that the longer expressions in the Select actions usually get messed up & will be greyed out.
You will need to go to each Select action and copy the expression text listed on the action, then remove it. Use the map button on the right side of the action to toggle to the table view & back to the single textbox view, then click in the textbox to bring up the content menu & expression tab. Finally, paste the expression text you copied back into the expression editor & remove any initial or end characters that do not belong in the expression input, like the starting '@' or ending '}'.
Excel Batch Upsert V5 Scope Code to Copy & Paste Into the "My clipboard" Tab of a New Action Menu
{"id":"84977e71-9686-4abc-8e74-db0046cae9f9","brandColor":"#8C3900","connectionReferences":{"shared_excelonlinebusiness":{"connection":{"id":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness/connections/39f5641011a343e8aac23b8d01bee065"}}},"connectorDisplayName":"Control","icon":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=","isTrigger":false,"operationName":"Excel_Batch_Upsert_V5","operationDefinition":{"type":"Scope","actions":{"Office_Script_Batch_Update":{"type":"Compose","inputs":"function main(workbook: ExcelScript.Workbook,\n TableName: string,\n PrimaryKeyColumnName: string,\n ForceMode1Processing: string,\n UpdatedData: updateddata[],\n) {\n let table = workbook.getTable(TableName);\n let RowNum: number;\n let TableRange = table.getRange()\n let TableRangeNoHeader = table.getRangeBetweenHeaderAndTotal()\n //If it is a blank table, add an empty row to prevent errors.\n if (TableRangeNoHeader.getRowCount() < 1) { table.addRow() } else { };\n let TableData = TableRange.getValues()\n let ArrayPK = table.getColumn(PrimaryKeyColumnName).getRange().getValues().join(\"#|#\").split(\"#|#\")\n let ArrayPKErrors = new Array(\"\")\n let ColumnCount = TableRange.getColumnCount()\n let TableSize = (TableRange.getRowCount()) * (ColumnCount)\n let TableSizeBatchProcLimit = 1000000\n\n console.log(`Table size is ${TableSize} cells.`);\n if (TableSize > TableSizeBatchProcLimit) { console.log(`You have exceeded the ${TableSizeBatchProcLimit} total table cell limit for processing larger batches on larger destination tables in the office script, please either reduce your destination table size or use a batch size of 1000 or less in the cloud flow.`) }\n // If the 
table is less than 1 million cells & not something likely big enough to make errors in the V2 batch processing method then use the batch processing, else use the V1 row by row update method that isn't as affected by table size, but does have a lower cloud flow batch size limit.\n // So if someone generally uses large batch sizes, but then their table grows past 1 million cells, then this will revert to the V1 row by row processing with the smaller batch file size limit and the cloud flow will start to error and they will need to switch their flow settings back to smaller batch sizes as well.\n if (TableSize < TableSizeBatchProcLimit && ForceMode1Processing != \"Yes\") {\n \n //Iterate through each object item in the array from the flow\n for (let i = 0; i < UpdatedData.length; i++) {\n //If the record's Primary Key value is found continue, else post to error log\n if (ArrayPK.indexOf(UpdatedData[i].PK) > 0) {\n //Get the row number for the line to update by matching the foreign key from the other datasource to the primary key in Excel\n RowNum = ArrayPK.indexOf(UpdatedData[i].PK)\n\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(UpdatedData[i]).length - 1; j++) {\n //Update each value for each item or column given\n TableData[RowNum][Number(Object.keys(UpdatedData[i])[j])] = UpdatedData[i][Number(Object.keys(UpdatedData[i])[j])]\n }\n }\n //Post PK not found value to ArrayPKErrors\n else { ArrayPKErrors.push(UpdatedData[i].PK) };\n }\n //Get array of 1st row formulas to re-apply to columns after posting the updates to the table\n let FirstRowFormulas = [\"\"]\n for (let c = 0; c < ColumnCount; c++) {\n FirstRowFormulas.push(TableRangeNoHeader.getColumn(c).getRow(0).getFormula());\n }\n FirstRowFormulas.shift();\n\n // If the entire table is smaller than 50,000 cells, then just post to the table. 
Else batch update.\n if (TableSize < 50000) {\n //Post table in memory to the Excel table\n TableRange.setValues(TableData);\n }\n else {\n\n // Run The Batch Update - (Batch update code source: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset)\n const CELLS_IN_BATCH = 15000;\n\n console.log(`Calling update range function...`);\n const updated = updateRangeInBatches(TableRange.getCell(0, 0), TableData, 10000);\n if (!updated) {\n console.log(`Update did not take place or complete. Check and run again.`);\n }\n\n function updateRangeInBatches(\n startCell: ExcelScript.Range,\n values: (string | boolean | number)[][],\n cellsInBatch: number\n 😞 boolean {\n const startTime = new Date().getTime();\n console.log(`Cells per batch setting: ${cellsInBatch}`);\n\n // Determine the total number of cells to write.\n const totalCells = values.length * values[0].length;\n console.log(`Total cells to update in the target range: ${totalCells}`);\n if (totalCells <= cellsInBatch) {\n console.log(`No need to batch -- updating directly`);\n updateTargetRange(startCell, values);\n return true;\n }\n\n // Determine how many rows to write at once.\n const rowsPerBatch = Math.floor(cellsInBatch / values[0].length);\n console.log(\"Rows per batch: \" + rowsPerBatch);\n let rowCount = 0;\n let totalRowsUpdated = 0;\n let batchCount = 0;\n\n // Write each batch of rows.\n for (let i = 0; i < values.length; i++) {\n rowCount++;\n if (rowCount === rowsPerBatch) {\n batchCount++;\n console.log(`Calling update next batch function. 
Batch#: ${batchCount}`);\n updateNextBatch(startCell, values, rowsPerBatch, totalRowsUpdated);\n\n // Write a completion percentage to help the user understand the progress.\n rowCount = 0;\n totalRowsUpdated += rowsPerBatch;\n console.log(`${((totalRowsUpdated / values.length) * 100).toFixed(1)}% Done`);\n }\n }\n console.log(`Updating remaining rows -- last batch: ${rowCount}`)\n if (rowCount > 0) {\n updateNextBatch(startCell, values, rowCount, totalRowsUpdated);\n }\n let endTime = new Date().getTime();\n console.log(`Completed ${totalCells} cells update. It took: ${((endTime - startTime) / 1000).toFixed(6)} seconds to complete. ${((((endTime - startTime) / 1000)) / cellsInBatch).toFixed(8)} seconds per ${cellsInBatch} cells-batch.`);\n return true;\n }\n /**\n * A helper function that computes the target range and updates. \n */\n function updateNextBatch(\n startingCell: ExcelScript.Range,\n data: (string | boolean | number)[][],\n rowsPerBatch: number,\n totalRowsUpdated: number\n ) {\n const newStartCell = startingCell.getOffsetRange(totalRowsUpdated, 0);\n const targetRange = newStartCell.getResizedRange(rowsPerBatch - 1, data[0].length - 1);\n console.log(`Updating batch at range ${targetRange.getAddress()}`);\n const dataToUpdate = data.slice(totalRowsUpdated, totalRowsUpdated + rowsPerBatch);\n try {\n targetRange.setValues(dataToUpdate);\n } catch (e) {\n throw `Error while updating the batch range: ${JSON.stringify(e)}`;\n }\n return;\n }\n /**\n * A helper function that computes the target range given the target range's starting cell\n * and selected range and updates the values.\n */\n function updateTargetRange(\n targetCell: ExcelScript.Range,\n values: (string | boolean | number)[][]\n ) {\n const targetRange = targetCell.getResizedRange(values.length - 1, values[0].length - 1);\n console.log(`Updating the range: ${targetRange.getAddress()}`);\n try {\n targetRange.setValues(values);\n } catch (e) {\n throw `Error while updating the whole range: 
${JSON.stringify(e)}`;\n }\n return;\n }\n }\n //Re-apply the formulas from the 1st row to the entire columns after the update\n for (let f = 0; f < ColumnCount; f++) {\n if (FirstRowFormulas[f].toString().startsWith(\"=\")) {\n TableRangeNoHeader.getColumn(f).getRow(0).setFormula(FirstRowFormulas[f])\n TableRangeNoHeader.getColumn(f).getRow(0).autoFill(table.getRangeBetweenHeaderAndTotal().getColumn(f).getAddress(), ExcelScript.AutoFillType.fillDefault)\n }\n }\n }\n // Update row by row if the table is too large\n else {\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < UpdatedData.length; i++) {\n //If the record's Primary Key value is found continue, else post to error log\n if (ArrayPK.indexOf(UpdatedData[i].PK) > 0) {\n //Get the row number for the line to update by matching the foreign key from the other datasource to the primary key in Excel\n RowNum = ArrayPK.indexOf(UpdatedData[i].PK)\n\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(UpdatedData[i]).length - 1; j++) {\n //Update each value for each item or column given\n TableRange.getCell(RowNum, Number(Object.keys(UpdatedData[i])[j])).setValue(UpdatedData[i][Number(Object.keys(UpdatedData[i])[j])])\n }\n }\n //Post PK not found value to ArrayPKErrors\n else { ArrayPKErrors.push(UpdatedData[i].PK) }\n }\n }\n //Post ArrayPKErrors to flow results\n console.log(\"Any primary key values not found are listed in the result array.\")\n ArrayPKErrors.shift();\n return ArrayPKErrors;\n}\n//The 1st few column indexes are provided incase anyone edits this for something else later. 
But the current flow & scripts should work fine with only the PK column here.\ninterface updateddata {\n '0': (string | undefined),\n '1': (string | undefined),\n '2': (string | undefined),\n '3': (string | undefined),\n '4': (string | undefined),\n '5': (string | undefined),\n 'PK': (string | undefined)\n}","runAfter":{},"description":"Go to an online Excel table, go to the Automate tab, select New Script, then replace the placeholder script code by pasting all this in the code editor. You may want to name the script BatchUpdateV5.","metadata":{"operationMetadataId":"7922f637-e66f-45ad-a5e1-c69106b4a210"}},"Office_Script_Batch_Create":{"type":"Compose","inputs":"function main(workbook: ExcelScript.Workbook,\n TableName: string,\n PrimaryKeyColumnName: string,\n ForceMode1Processing: string,\n CreateData: createdata[],\n) {\n let table = workbook.getTable(TableName);\n let TableRange = table.getRange();\n let TableRangeNoHeader = table.getRangeBetweenHeaderAndTotal();\n //If it is a blank table, add an empty row to prevent errors.\n if (TableRangeNoHeader.getRowCount() < 1) { table.addRow() } else { };\n let TableData = TableRange.getValues();\n let PKColumnIndex = table.getColumn(PrimaryKeyColumnName).getIndex();\n CreateData = JSON.parse(JSON.stringify(CreateData).split('\"PK\"').join(`\"${PKColumnIndex}\"`))\n let ColumnCount = TableRange.getColumnCount();\n let InitialNumberOfTableRows = TableRange.getRowCount();\n let TableSize = (InitialNumberOfTableRows) * (ColumnCount);\n let TableSizeBatchProcLimit = 1000000;\n //let BlankLine: string [];\n //BlankLine = new Array(ColumnCount);\n let RowNum: number;\n RowNum = InitialNumberOfTableRows;\n //-1 to 0 index the RowNum\n RowNum = RowNum - 1;\n\n console.log(`Table size is ${TableSize} cells.`);\n if (TableSize > TableSizeBatchProcLimit) { console.log(`You have exceeded the ${TableSizeBatchProcLimit} total table cell limit for processing larger batches on larger destination tables in the office script, please 
either reduce your destination table size or use a batch size of 1000 or less in the cloud flow.`) }\n // If the table is less than 1 million cells & not something likely big enough to make errors in the V2 batch processing method then use the batch processing, else use the V1 row by row create method that isn't as affected by table size, but does have a lower cloud flow batch size limit.\n // So if someone generally uses large batch sizes, but then their table grows past 1 million cells, then this will revert to the V1 row by row processing with the smaller batch file size limit and the cloud flow will start to error and they will need to switch their flow settings back to smaller batch sizes as well.\n if (TableSize < TableSizeBatchProcLimit && ForceMode1Processing != \"Yes\") {\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < CreateData.length; i++) {\n //Create an empty row at the end of the 2D table array & increase RowNum by 1 for the next line updating that row\n TableData.push(new Array(ColumnCount))\n //Earlier less efficient code --- TableData.push(\",\".repeat(ColumnCount - 1).split(\",\"));\n RowNum++;\n\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(CreateData[i]).length; j++) {\n //Create each value for each item or column given\n TableData[RowNum][Number(Object.keys(CreateData[i])[j])] = CreateData[i][Number(Object.keys(CreateData[i])[j])];\n }\n }\n //Get array of 1st row formulas to re-apply to columns after posting the updates to the table\n let FirstRowFormulas = [\"\"]\n for (let c = 0; c < ColumnCount; c++) {\n FirstRowFormulas.push(TableRangeNoHeader.getColumn(c).getRow(0).getFormula());\n }\n FirstRowFormulas.shift();\n\n // Run The Batch Create - (Batch update code source: https://docs.microsoft.com/en-us/office/dev/scripts/resources/samples/write-large-dataset)\n const CELLS_IN_BATCH = 15000;\n\n console.log(`Calling update range function...`);\n const 
updated = updateRangeInBatches(TableRange.getCell(0, 0), TableData, 10000);\n if (!updated) {\n console.log(`Update did not take place or complete. Check and run again.`);\n }\n\n function updateRangeInBatches(\n startCell: ExcelScript.Range,\n values: (string | boolean | number)[][],\n cellsInBatch: number\n 😞 boolean {\n const startTime = new Date().getTime();\n console.log(`Cells per batch setting: ${cellsInBatch}`);\n\n // Determine the total number of cells to write.\n const totalCells = values.length * values[0].length;\n console.log(`Total cells to update in the target range: ${totalCells}`);\n if (totalCells <= cellsInBatch) {\n console.log(`No need to batch -- updating directly`);\n updateTargetRange(startCell, values);\n return true;\n }\n\n // Determine how many rows to write at once.\n const rowsPerBatch = Math.floor(cellsInBatch / values[0].length);\n console.log(\"Rows per batch: \" + rowsPerBatch);\n let rowCount = 0;\n let totalRowsUpdated = 0;\n let batchCount = 0;\n\n // Write each batch of rows.\n for (let i = 0; i < values.length; i++) {\n rowCount++;\n if (rowCount === rowsPerBatch) {\n batchCount++;\n console.log(`Calling update next batch function. Batch#: ${batchCount}`);\n updateNextBatch(startCell, values, rowsPerBatch, totalRowsUpdated);\n\n // Write a completion percentage to help the user understand the progress.\n rowCount = 0;\n totalRowsUpdated += rowsPerBatch;\n console.log(`${((totalRowsUpdated / values.length) * 100).toFixed(1)}% Done`);\n }\n }\n console.log(`Updating remaining rows -- last batch: ${rowCount}`)\n if (rowCount > 0) {\n updateNextBatch(startCell, values, rowCount, totalRowsUpdated);\n }\n let endTime = new Date().getTime();\n console.log(`Completed ${totalCells} cells update. It took: ${((endTime - startTime) / 1000).toFixed(6)} seconds to complete. 
${((((endTime - startTime) / 1000)) / cellsInBatch).toFixed(8)} seconds per ${cellsInBatch} cells-batch.`);\n return true;\n }\n /**\n * A helper function that computes the target range and updates. \n */\n function updateNextBatch(\n startingCell: ExcelScript.Range,\n data: (string | boolean | number)[][],\n rowsPerBatch: number,\n totalRowsUpdated: number\n ) {\n const newStartCell = startingCell.getOffsetRange(totalRowsUpdated, 0);\n const targetRange = newStartCell.getResizedRange(rowsPerBatch - 1, data[0].length - 1);\n console.log(`Updating batch at range ${targetRange.getAddress()}`);\n const dataToUpdate = data.slice(totalRowsUpdated, totalRowsUpdated + rowsPerBatch);\n try {\n targetRange.setValues(dataToUpdate);\n } catch (e) {\n throw `Error while updating the batch range: ${JSON.stringify(e)}`;\n }\n return;\n }\n /**\n * A helper function that computes the target range given the target range's starting cell\n * and selected range and updates the values.\n */\n function updateTargetRange(\n targetCell: ExcelScript.Range,\n values: (string | boolean | number)[][]\n ) {\n const targetRange = targetCell.getResizedRange(values.length - 1, values[0].length - 1);\n console.log(`Updating the range: ${targetRange.getAddress()}`);\n try {\n targetRange.setValues(values);\n } catch (e) {\n throw `Error while updating the whole range: ${JSON.stringify(e)}`;\n }\n return;\n }\n //Re-apply the formulas from the 1st row to the entire columns after the update\n for (let f = 0; f < ColumnCount; f++) {\n if (FirstRowFormulas[f].toString().startsWith(\"=\")) {\n TableRangeNoHeader.getColumn(f).getRow(0).setFormula(FirstRowFormulas[f])\n TableRangeNoHeader.getColumn(f).getRow(0).autoFill(table.getRangeBetweenHeaderAndTotal().getColumn(f).getAddress(), ExcelScript.AutoFillType.fillDefault)\n }\n }\n }\n // Create row by row if the table is too large\n else {\n let Blank2dArray = new Array(CreateData.length).fill(null).map(() => new Array(ColumnCount).fill(null));\n let 
TableData = Blank2dArray;\n //Iterate through each object item in the array from the flow\n for (let i = 0; i < CreateData.length; i++) {\n //i is also the row number\n //Iterate through each item or line of the current object\n for (let j = 0; j < Object.keys(CreateData[i]).length; j++) {\n //Create each value for each item or column given\n TableData[i][Number(Object.keys(CreateData[i])[j])] = CreateData[i][Number(Object.keys(CreateData[i])[j])];\n }\n }\n //Append the in-memory TableData to the Excel table\n //Split the in-memory TableData into batches that addRows can handle without error & loop through to add each batch in addRows\n let BatchSize = 200;\n //length - 1 prevents a later slice(n, n) or slice(1000, 1000) which would return a null array & error the addRows\n let NumBatches = Math.floor((TableData.length - 1) / BatchSize);\n for (let k = 0; k <= NumBatches; k++) {\n let BatchRows = TableData.slice(k * BatchSize, (k + 1) * BatchSize)\n table.addRows(-1, BatchRows);\n }\n }\n}\n//The 1st few column indexes are provided incase anyone edits this for something else later. But the current flow & scripts should work fine with only the PK column here.\ninterface createdata {\n '0': (string | undefined),\n '1': (string | undefined),\n '2': (string | undefined),\n '3': (string | undefined),\n '4': (string | undefined),\n '5': (string | undefined),\n 'PK': (string | undefined)\n}","runAfter":{"Office_Script_Batch_Update":["Succeeded"]},"description":"Go to an online Excel table, go to the Automate tab, select New Script, then replace the placeholder script code by pasting all this in the code editor. 
You may want to name the script BatchCreateV5.","metadata":{"operationMetadataId":"f8ccc556-b345-429e-8c51-af7bb6b42452"}},"List_rows_Sample_source_data_-Placeholder-":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"GetItems","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4HB4HVGZLNYHNDLYN3M3UGHB2UT","table":"{00000000-000C-0000-FFFF-FFFF00000000}","$top":2000,"dateTimeFormat":"ISO 8601"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"Office_Script_Batch_Create":["Succeeded"]},"description":"Pagination set to 25000. Initial top count set to a 2000 row batch.","runtimeConfiguration":{"paginationPolicy":{"minimumItemCount":25000}},"limit":{"timeout":"PT5M"},"metadata":{"01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/SourceData.xlsx","operationMetadataId":"408aee2b-05c0-44e2-8468-78c78fb0530d","tableId":"{00000000-000C-0000-FFFF-FFFF00000000}","01JWBUU4EEP2QHP4DFINBKWWCSYPOMRAA5":"/SourceData.xlsx","01JWBUU4HB4HVGZLNYHNDLYN3M3UGHB2UT":"/SourceTable.xlsx"}},"Match_new_and_existing_data_key_values_then_batch_update":{"type":"Scope","actions":{"List_rows_Get_header_sample":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"GetItems","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5","table":"{00000000-000C-0000-FFFF-FFFF00000000}","$top":2},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"SelectGenerateUpdateData":["Succeeded"]},"description":"Input the Location, Doc 
Library, File, & Table information you normally would to get the table you want to update. It isn't meant to get the data, just the headers, so it is limited to just the 1st few rows.","metadata":{"01OVBUQJMSNAHSGEXJCNFISQKOJWJ2ZY3N":"/Book.xlsx","operationMetadataId":"64ff543c-8881-4e6c-b49e-7bbadd6fbdd2","tableId":"{00000000-000C-0000-FFFF-FFFF00000000}","01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/Book.xlsx","01PCEUDVCCZ7IYLT5RUBC3N6VXZWCYFNNI":"/DestinationData.xlsx","01JWBUU4DRJEFUSQGBG5E2QSRSXUIHVZNB":"/DestinationData.xlsx","01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5":"/DestinationTable.xlsx"}},"SelectGenerateUpdateData":{"type":"Select","inputs":{"from":"@outputs('List_rows_Sample_source_data_-Placeholder-')?['body/value']","select":{"PrimaryKey":"@item()?['SourceC1']","DestinationC2":"@item()?['SourceC2']","DestinationC3":"@item()?['SourceC3']","DestinationC4":"@item()?['SourceC4']","DestinationC5":"@item()?['SourceC5']","DestinationC6":"@item()?['SourceC6']","DestinationC17":"@null","DestinationC18":"","Destination-C 27":"1","DestinationC28":"@if(empty(item()?['SourceC6']), 0, 1)"}},"runAfter":{},"description":" Except PrimaryKey, input EXACT Excel column names on the left & updated values from other sources on the right. Set the PrimaryKey row /w matching key values from the other source. 
Blank '' values don't alter data, null values update cells to empty.","metadata":{"operationMetadataId":"b221bdd9-0a32-43e4-8498-a8116e6b0a83"}},"Run_script_Update_Excel_rows":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"RunScriptProd","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5","scriptId":"ms-officescript%3A%2F%2Fonedrive_business_itemlink%2F01JWBUU4CI62ZOXSKDIJD2KRGYGN2R4NX4","ScriptParameters/TableName":"Table1","ScriptParameters/PrimaryKeyColumnName":"GeneratePK","ScriptParameters/ForceMode1Processing":"No","ScriptParameters/UpdatedData":"@outputs('ComposeDestinationJSON')"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"ComposeDestinationJSON":["Succeeded"]},"description":"Change Location, Doc Library, File, Script, TableName, PrimaryKeyColumnName, & KeepFormula fields to match your use-case. 
Make sure the primary key column listed is the column you want to be matched with the \"Generate update data\" PrimaryKey values.","limit":{"timeout":"PT15M"},"metadata":{"01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/Book.xlsx","operationMetadataId":"713f593f-c5e4-43b6-ad08-3a71051bbf87","tableId":null,"01PCEUDVCCZ7IYLT5RUBC3N6VXZWCYFNNI":"/DestinationData.xlsx","01JWBUU4DRJEFUSQGBG5E2QSRSXUIHVZNB":"/DestinationData.xlsx","01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5":"/DestinationTable.xlsx"}},"SelectDestinationColumnNumbersByName":{"type":"Select","inputs":{"from":"@range(0, sub(length(body('SelectReformatHeaders')), 1))","select":{"@{split(body('SelectReformatHeaders')[item()], '\":')?[0]}":"@item()"}},"runAfter":{"SelectReformatHeaders":["Succeeded"]},"metadata":{"operationMetadataId":"87bfb21b-ef06-4f23-99dc-20b0b95e4f0c"}},"ReformatHeaderObject":{"type":"Compose","inputs":"@json(replace(replace(replace(string(body('SelectDestinationColumnNumbersByName')), '[{', '{'), '}]', '}'), '},{', ','))","runAfter":{"SelectDestinationColumnNumbersByName":["Succeeded"]},"description":"Convert the array of JSON objects to a single searchable JSON object","metadata":{"operationMetadataId":"69b1d189-d56f-45fc-8dbb-a49740832a64"}},"SelectArrayOfArraysUpdateData":{"type":"Select","inputs":{"from":"@body('SelectAddRecordBreakToEach')","select":"@join(json(replace(\r\nreplace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(\r\nconcat('[', skip(take(string(item()), sub(length(string(item())), 1)), 1), ']'),\r\n'\":\"', '\",\":\",\"'), '\":\",\":\",', '\",\":\",\":'),\r\n'\":0', '\",\":\",0'), '\",\":\",\",\":\",0', '\",\":\",\":0'),\r\n'\":1', '\",\":\",1'), '\",\":\",\",\":\",1', '\",\":\",\":1'),\r\n'\":2', '\",\":\",2'), '\",\":\",\",\":\",2', '\",\":\",\":2'),\r\n'\":3', '\",\":\",3'), 
'\",\":\",\",\":\",3', '\",\":\",\":3'),\r\n'\":4', '\",\":\",4'), '\",\":\",\",\":\",4', '\",\":\",\":4'),\r\n'\":5', '\",\":\",5'), '\",\":\",\",\":\",5', '\",\":\",\":5'),\r\n'\":6', '\",\":\",6'), '\",\":\",\",\":\",6', '\",\":\",\":6'),\r\n'\":7', '\",\":\",7'), '\",\":\",\",\":\",7', '\",\":\",\":7'),\r\n'\":8', '\",\":\",8'), '\",\":\",\",\":\",8', '\",\":\",\":8'),\r\n'\":9', '\",\":\",9'), '\",\":\",\",\":\",9', '\",\":\",\":9'),\r\n'\":[', '\",\":\",['), '\",\":\",\",\":\",[', '\",\":\",\":['),\r\n'\":{', '\",\":\",{'), '\",\":\",\",\":\",{', '\",\":\",\":{'),\r\n'\":null', '\",\":\",\"null\"'), '\",\":\",\",\":\",\"null\"', '\",\":\",\":\"null\"'),\r\n'\":true', '\",\":\",\"true\"'), '\",\":\",\",\":\",\"true\"', '\",\":\",\":\"true\"'),\r\n'\":false', '\",\":\",\"false\"'), '\",\":\",\",\":\",\"false\"', '\",\":\",\":\"false\"'),\r\n',\":\",', ','), '_#-', '_#_')), '_#-')"},"runAfter":{"SelectAddRecordBreakToEach":["Succeeded"]},"description":"And filter out the JSON colon : items","metadata":{"operationMetadataId":"a5f08622-a062-4a8a-9c34-b09b0bd9b895"}},"SelectAddRecordBreakToEach":{"type":"Select","inputs":{"from":"@body('SelectGenerateUpdateData')","select":"@removeProperty(\r\naddProperty(\r\naddProperty(item(), 'PK', item()['PrimaryKey']),\r\n'_|-', ''),\r\n'PrimaryKey')"},"runAfter":{"ReformatHeaderObject":["Succeeded"]},"metadata":{"operationMetadataId":"33f058fd-c61a-4580-8957-7c263906bd72"}},"ComposeDestinationJSON":{"type":"Compose","inputs":"@json(concat('[{',\r\nskip(\r\ntake(\r\nreplace(join(body('FilterOnlyKeyValueStrings'), ','), ',\"_|-\":\"\",', '},{'),\r\nsub(length(replace(join(body('FilterOnlyKeyValueStrings'), ','), ',\"_|-\":\"\",', '},{')), 9)),\r\n0),\r\n'}]'))","runAfter":{"FilterOnlyKeyValueStrings":["Succeeded"]},"description":"Exceed the 210,000 ProcessAllCells limit & this may error & not parse. 
Either reduce your batch size or add more SelectDestinationIndexCells # actions to the ProcessAllCells Scope & FilterOnlyKeyValueStrings.","metadata":{"operationMetadataId":"9fc0353c-c586-4590-9366-c6df90dda58a"}},"FilterOnlyKeyValueStrings":{"type":"Query","inputs":{"from":"@json(replace(replace(concat(\r\nstring(body('SelectDestinationIndexCells')), '-#-',\r\nstring(body('SelectDestinationIndexCells_2')), '-#-',\r\nstring(body('SelectDestinationIndexCells_3')), '-#-',\r\nstring(body('SelectDestinationIndexCells_4')), '-#-',\r\nstring(body('SelectDestinationIndexCells_5'))\r\n), '-#-[]', ''), ']-#-[', ','))","where":"@not(equals(item(), '_||'))"},"runAfter":{"ProcessAllCells":["Succeeded"]},"description":"Combine all processed cell arrays & filter blanks","metadata":{"operationMetadataId":"d6b0a71e-6b59-42b5-ac8e-771cf48d6814"}},"ProcessAllCells":{"type":"Scope","actions":{"SelectDestinationIndexCells":{"type":"Select","inputs":{"from":"@range(0, max(Min(100000, length(outputs('CustomDelimitedCells'))), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"1st 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"CustomDelimitedCells":{"type":"Compose","inputs":"@split(Join(body('SelectArrayOfArraysUpdateData'), '_#-'), '_#-')","runAfter":{},"metadata":{"operationMetadataId":"39e5ede0-3b02-469c-b6f7-15221fee5668"}},"SelectDestinationIndexCells_2":{"type":"Select","inputs":{"from":"@range(100000, max(min(sub(length(outputs('CustomDelimitedCells')), 100000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"2nd 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"SelectDestinationIndexCells_3":{"type":"Select","inputs":{"from":"@range(200000, max(min(sub(length(outputs('CustomDelimitedCells')), 200000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"3rd 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"SelectDestinationIndexCells_4":{"type":"Select","inputs":{"from":"@range(300000, max(min(sub(length(outputs('CustomDelimitedCells')), 300000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"4th 100000 of flattened array. 
Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}},"SelectDestinationIndexCells_5":{"type":"Select","inputs":{"from":"@range(400000, max(min(sub(length(outputs('CustomDelimitedCells')), 400000), 100000), 0))","select":"@if(equals(mod(item(), 2), 0),\r\nif(equals(outputs('CustomDelimitedCells')[add(item(), 1)], '\"\"'), '_||',\r\nif(or(equals(outputs('CustomDelimitedCells')[item()], 'PK'), equals(outputs('CustomDelimitedCells')[item()], '_|-')),\r\nconcat('\"', outputs('CustomDelimitedCells')[item()], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"'),\r\nconcat('\"', outputs('ReformatHeaderObject')[outputs('CustomDelimitedCells')[item()]], '\":\"', replace(replace(outputs('CustomDelimitedCells')[add(item(), 1)], '\"', '″'), 'null', ''), '\"')\r\n)\r\n),\r\n'_||'\r\n)"},"runAfter":{"CustomDelimitedCells":["Succeeded"]},"description":"5th 100000 of flattened array. Will error if the Generate update data mapping doesn't exactly match the destination columns.","metadata":{"operationMetadataId":"e1692a16-ad6c-4d26-a285-12522d144839"}}},"runAfter":{"SelectArrayOfArraysUpdateData":["Succeeded"]},"description":"The Range( ) expression is required for this & has a limit of 100,000 items. This breaks things up if someone wants to load large tables up to 210,000 cells, ex: 7 columns & 30000 rows. 
Exceed 210,000 & the later JSON action will error & not parse.","metadata":{"operationMetadataId":"fec90f32-16ae-41f9-9b14-934fd19a3df3"}},"SelectReformatHeaders":{"type":"Select","inputs":{"from":"@skip(split(\r\nreplace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(replace(\r\nstring(first(outputs('List_rows_Get_header_sample')?['body/value'])),\r\n'\":0', '\":\"0'),\r\n'\":1', '\":\"1'),\r\n'\":2', '\":\"2'),\r\n'\":3', '\":\"3'),\r\n'\":4', '\":\"4'),\r\n'\":5', '\":\"5'),\r\n'\":6', '\":\"6'),\r\n'\":7', '\":\"7'),\r\n'\":8', '\":\"8'),\r\n'\":9', '\":\"9'),\r\n'\":[', '\":\"['),\r\n'\":{', '\":\"{'),\r\n'\":null', '\":\"null\"'),\r\n'\":true', '\":\"true\"'),\r\n'\":false', '\":\"false\"'),\r\n'\":\"'), 2)","select":"@last(split(item(), ',\"'))"},"runAfter":{"List_rows_Get_header_sample":["Succeeded"]},"metadata":{"operationMetadataId":"dfd7c59f-8346-4a15-8f41-b5fe6200c746"}}},"runAfter":{"List_rows_Sample_source_data_-Placeholder-":["Succeeded"]},"description":"Fill out the Generate update data, List rows Get header sample, & Run script Update Excel rows actions with your data, workbook, & script information.","metadata":{"operationMetadataId":"175119c4-e881-4899-81d6-14aaa7f741f8"}},"Get_records_not_in_found_in_table_then_batch_create":{"type":"Scope","actions":{"Filter_array_Get_records_not_found_in_table":{"type":"Query","inputs":{"from":"@outputs('ComposeDestinationJSON')","where":"@contains(outputs('Run_script_Update_Excel_rows')?['body/result'], item()['PK'])"},"runAfter":{},"description":"Get the records from the batch DestinationJSON action where the record primary key is in the list of primary keys not found in the Run script Update Excel 
action","metadata":{"operationMetadataId":"c340f3a3-1696-4ebb-9d2b-3b0463ae3773"}},"Run_script_Create_Excel_rows":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_excelonlinebusiness","operationId":"RunScriptProd","apiId":"/providers/Microsoft.PowerApps/apis/shared_excelonlinebusiness"},"parameters":{"source":"me","drive":"b!y2rxipfTvUi7ALZMiyxLa89aQqkbMORMksjFtKI_dfKgJA8V37mjQICJm9mYgy-T","file":"01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5","scriptId":"ms-officescript%3A%2F%2Fonedrive_business_itemlink%2F01JWBUU4F2LHYL4XTPENFZJI3WGXKF6ZWM","ScriptParameters/TableName":"Table1","ScriptParameters/PrimaryKeyColumnName":"GeneratePK","ScriptParameters/ForceMode1Processing":"No","ScriptParameters/CreateData":"@body('Filter_array_Get_records_not_found_in_table')"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{"Filter_array_Get_records_not_found_in_table":["Succeeded"]},"description":"Change Location, Doc Library, File, Script, TableName, PrimaryKeyColumnName, & KeepFormulas fields to match your use-case. Make sure the primary key column listed is the column you want to be matched with the \"Generate update data\" PrimaryKey values.","limit":{"timeout":"PT5M"},"metadata":{"01PCEUDVFYX74MYI46EJCZF6AUHSUSPNMU":"/Book.xlsx","operationMetadataId":"713f593f-c5e4-43b6-ad08-3a71051bbf87","tableId":null,"01PCEUDVCCZ7IYLT5RUBC3N6VXZWCYFNNI":"/DestinationData.xlsx","01JWBUU4DRJEFUSQGBG5E2QSRSXUIHVZNB":"/DestinationData.xlsx","01JWBUU4CQU6JVUFYHMJDIJK4FSNCANGT5":"/DestinationTable.xlsx"}}},"runAfter":{"Match_new_and_existing_data_key_values_then_batch_update":["Succeeded"]},"description":"Fill out the Run script Create Excel rows action with your data, workbook, & script information.","metadata":{"operationMetadataId":"9e2a036b-c6aa-461a-81f3-efed57aa2f71"}}},"runAfter":{}}}
New action menu tab to paste the code into...
New Scope in Your Flow...
Likely errors in the Select action expressions...
Again: you will need to repair each Select action's expression. For each Select action, copy the expression text shown on the action, delete it, use the map button on the right side of the action to toggle to the table view and back to the single-textbox view, then click in the textbox to bring up the content menu and its Expression tab. Paste the copied expression text back into the expression editor and remove any leading or trailing characters that don't belong in the expression input, such as the starting '@' or ending '}'.
Hi, thank you so much for sharing these wonderful scripts to delete and update Excel tables at speed. I am trying to use this for the following use case:
I have 5 Excel files in a SharePoint document library with information about resources in 5 teams. The column names are the same in all source files, and each source file is roughly 100 rows. These files are updated as new resources come in and existing ones leave. I want a consolidated Excel table in a destination file that contains all current rows from the source files, with some filtering on source columns.
Currently I have tried the delete script/flow followed by the upsert V5 script and flow. It works well for getting data from one source file into the consolidated file, but when I run it again with the 2nd source file, it overwrites the rows I got from the 1st source file. Is there any way to tweak the script so that it does not overwrite, but instead appends the rows from the multiple source tables into one consolidated table in the destination file? I can share more details if required. Thank you so much for your help.
@takolota Great work here! I have an issue I hope you (or anyone else!) can help me with.
I have a flow which takes a table, creates a new file with a blank table for each project ID in the main table, and then batch creates all the rows for each project into the corresponding blank table. The issue is that the script is not creating rows inside the blank tables at all, but below them, and I can't figure out why.
The log seems to have the right range.
Any ideas?
@berdavnes
I tried & couldn't replicate that issue on the newest version (v5).
The only known problem I ran into is that part of the flow requires pulling the headers from the existing table, and if the table has no rows yet, the List rows action won't return anything, so the flow can't pull the header names.
For that, you need to add an extra blank row to the template table so the List rows action has something to pull and can get the header column names.
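If you want to guarantee that extra row automatically, a small Office Script run before the flow could add a blank row whenever the table is empty. This is only a sketch, and it assumes the destination table is named Table1; adjust the name to match your workbook:

```typescript
function main(workbook: ExcelScript.Workbook) {
  // Assumption: the destination table is named "Table1" - change as needed.
  const table = workbook.getTable("Table1");
  // If the table has no data rows, the List rows action returns nothing
  // and the flow cannot read the header names, so insert one blank row.
  if (table.getRowCount() === 0) {
    table.addRow();
  }
}
```

You could call this from a Run script action at the start of the flow, before the List rows Get header sample action.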
@takolota Thanks for your script! But I'm seeing the same issue on a blank table (cleared by Bulk Delete) with Bulk Create V5: all the new data ends up outside the table range. That's weird!
To mitigate this issue, my workaround is to use the script below to resize the table:
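The poster's actual script wasn't included in the thread, but a minimal resize sketch might look like the following. It assumes the table is named Table1 and that the stray rows were written directly below the existing table on an otherwise empty sheet; verify both assumptions against your workbook before using it:

```typescript
function main(workbook: ExcelScript.Workbook) {
  // Assumption: the table is named "Table1" - change to match your workbook.
  const table = workbook.getTable("Table1");
  const sheet = table.getWorksheet();
  // First header cell, e.g. "A1" (strip the "Sheet1!" prefix from the address).
  const firstCell = table.getHeaderRowRange().getAddress().split("!")[1].split(":")[0];
  // Last used cell on the sheet, which includes any rows written
  // below the current table boundary.
  const lastCell = sheet.getUsedRange().getLastCell().getAddress().split("!")[1];
  // Expand the table so it covers everything from the header to the last used cell.
  table.resize(sheet.getRange(`${firstCell}:${lastCell}`));
}
```

Note this only works as intended if nothing else occupies the sheet below or beside the table, since getUsedRange() covers all populated cells on the worksheet.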
Yes, so try adding another row beyond the default row. The default row isn't actually a row; there is nothing a List rows action would return from it. I'm not even sure how you got the rows to populate after the table, as the flow relies on pulling a row from the table to get the header names.
Maybe both of you tried doing the batch create after a batch delete, and there is something oddly specific to that scenario. I'll have to try testing that case.
I posted a few weeks ago about having an issue importing, the solution you replied with ended up working perfectly. Thank you so much for the help with that!
I do have another question for you regarding a specific flow I'm having an issue with. In this flow, I delete all rows from a table so that only the header row remains. After that, I need to insert about 1,000 records. I have it working with Microsoft's standard Excel add-a-row action, but the performance is absolutely awful.
I have been trying to implement your solution to speed it up, but since I have 0 rows in my Excel table when the insert process begins, there is no primary key to pass in. Is it possible to call the upsert without a primary key?
Thanks again, I really appreciate any help you may be able to offer.