02-01-2022 23:02 - last edited 11-15-2023 21:20
Hello All,
I ran into some issues when pulling CSV data in from Power Automate Desktop because of commas inside the actual data. I wanted to change the delimiter so I could parse the data in a single Select action without in-data commas messing things up. I also may be parsing CSV files with hundreds or thousands of rows, so I didn't want to use up all my daily actions in a slow Apply to each loop.
Attached is the scope/flow I built so anyone can easily select their CSV data that has quotes around the comma-containing records, enter a new delimiter, and get the new delimiter-separated data from the final Compose action without the usual errors. And it only takes a few actions to do this, even on very large files.
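For anyone who wants to see the core idea outside of a flow, here's a rough Python sketch of the same trick (this is just an illustration of the concept, not the flow's actual expressions; the function name and sample data are made up):

```python
def change_delimiter(csv_text: str, new_delim: str = "|") -> str:
    """Swap field-separating commas for new_delim while leaving
    commas inside double-quoted fields untouched."""
    # Splitting on '"' means every odd-indexed chunk sits inside a
    # quoted field (assuming balanced quotes and no escaped "" quotes,
    # which is roughly the same limitation noted for mixed arrays below).
    chunks = csv_text.split('"')
    for i in range(0, len(chunks), 2):  # even indexes = outside quotes
        chunks[i] = chunks[i].replace(",", new_delim)
    return '"'.join(chunks)

sample = 'Name,Address,Age\n"Doe, Jane","1 Main St, Springfield",34'
print(change_delimiter(sample))
# Name|Address|Age
# "Doe, Jane"|"1 Main St, Springfield"|34
```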
I've found that many CSV files don't put quotes around the records with in-data commas, and this only works when there are quotes around those records. But if the file is re-saved as a text file, quotes often get added around the right records. So if you are using Power Automate Desktop, have it save the file as .txt and read that into your output variable.
It's currently set to handle up to 50 comma-containing columns, but you can expand that to as many columns as needed by adding extra lines & expressions to the 1st Select action. Just follow the existing pattern, incrementing the array numbers, e.g. changing [50] to [51].
Also, if your data has more unusual values, like an array mixing quoted strings with other data types, e.g. ["string1", 2, 03/05/2022, "string3"], then this will create errors in the output.
The template for parsing CSV to JSON & entering it into a dataset uses the same change delimiter set-up: https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/CSV-to-Dataset/m-p/1508191#M584
*Copying the template scope into another flow may not work, as it can mess up the expressions. You may need to start from a copy of this template and copy & paste the rest of your flow into it.
Thanks for any feedback, & please subscribe to my YouTube channel (https://youtube.com/@tylerkolota?si=uEGKko1U8D29CJ86)
Version 3 Uploaded 03/26/2022 (Adjusted the 1st Select input so it can now also deal with in-data commas in the 1st column. Added more lines to the 1st Select so it can now handle up to 50 columns with commas in them.)
Google Drive Link: https://drive.google.com/file/d/11uT15hXY0VjnOKDvFxdVgkuMtXqTmA0c/view?usp=sharing
Version 4 Uploaded 04/09/2022
(More minor fixes & additions.
I adjusted several expressions so it can now handle a few more scenarios with arrays in the CSV data. It should handle any array that doesn't include double quotes and any array that is all double-quoted strings, such as ["String1", "String2", "String3"], but it will have issues with a mixed array containing some double-quoted strings and some other values; for example, ["String", 4, 03/05/2022, "String2"] won't work.
I also adjusted how the LineBreak setting is set up so it now uses \r\n for the LineBreak, and I provided a link in the flow so anyone can look up the right string for the decodeUriComponent expression(s) if they happen to have different LineBreak characters. This change also made it possible to differentiate between in-data line-breaks and CSV row line-breaks on the files I tested, so it should now replace in-data line-breaks, like those in the multiple-choice fields some sites use, with semicolons. That should make those records much easier to deal with & parse in later actions (see the short sketch after these notes).
I also looked at a problem with in-data trailing commas. I added a line in the settings where anyone can toggle whether they want it to adjust for trailing OR leading commas in the data; it just can't handle both in one dataset. So if one column in one row has ",String1 String2" and another column in another row has "String3 String4," then there will be errors.)
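For reference on the linebreak piece: %0D%0A is the URI-encoded form of \r\n, which is what decodeUriComponent('%0D%0A') returns in a flow expression. Here's a minimal Python sketch of the row-break vs in-data-break separation, assuming CSV rows end with \r\n while in-data breaks are bare \n (the sample data is made up):

```python
raw = 'Question,Choices\r\n"Q1","Red\nGreen\nBlue"\r\n"Q2","Yes"'

rows = raw.split("\r\n")                      # CSV rows end with CRLF
rows = [r.replace("\n", ";") for r in rows]   # leftover bare LFs are in-data

for r in rows:
    print(r)
# Question,Choices
# "Q1","Red;Green;Blue"
# "Q2","Yes"
```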
Google Drive Link: https://drive.google.com/file/d/1ZbhFGVKHSpaH2Duv8qXwMnNww8czdgc4/view?usp=sharing
Version 5
More adjustments for comma edge cases and for instances with one comma-containing value following another in the file data.
Google Drive Link: https://drive.google.com/file/d/1il_wI9fJRk11YaI4EPQvk2efrbBNRBwr/view?usp=sharing
Update 06/01/2022
Microsoft Power Platform & Paul Murana recently did a video demonstration of how to handle CSV & other files in dataflows: https://youtu.be/8IvHxRnwJ7Q
But it currently only outputs to a Dataverse or Dataverse for Teams table.
@takolota ,
Swapping out the v3 and introducing the v5 scope went very well.
Although I had a minor challenge with the copy/paste, it was very easy.
We had started having issues with the batch creation on a SharePoint list; some records were simply not arriving.
For example, a CSV file with 3487 records would give us a list with only ~2897.
After inserting the V5 scope, things are now in full sync.
Dataverse is definitely our next step.
Ah, sorry about any issues with that earlier version.
I recently noticed the batch update timing out & missing a record or two at the 1000 batch size. If you have it set to 1000, you may want to lower it to around 800.
I also recently released a V1.5 for that template to make some of the inputs easier & tweak the skip count expression in the example Excel action. I don't think it would be worth changing anything out, but it's there for future reference.
What happens if the text of a column has a ; in it?
@mingo3369
If the text of a column that has linebreaks already contains a ;, then it may be harder to tell the original semicolons apart from the replaced linebreaks when you parse things afterwards.
You could always try changing the semicolon ; to something else in the Select Reformat expression.
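If semicolons might already appear in your data, one option is to pick a replacement character that can't show up in the data. A quick Python sketch of that check (the helper name and candidate characters are just suggestions, not part of the template):

```python
def pick_safe_char(text: str, candidates: str = "|~^\x1f") -> str:
    """Return the first candidate that never appears in the data,
    so in-data linebreak replacements stay unambiguous."""
    for c in candidates:
        if c not in text:
            return c
    raise ValueError("every candidate already appears in the data")

# e.g. use the result in place of the semicolon in the
# Select Reformat expression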
Hello Everyone,
Microsoft Power Platform & Paul Murana recently did a video demonstration of how to handle CSV & other files in dataflows: https://youtu.be/8IvHxRnwJ7Q
This looks like a more well-tested solution with many additional data transformation options. The dataflows connector looks to be non-premium, so if you have access to it, I suggest using it.
This custom-built flow-only solution may still be helpful for those without access to the right Power Apps environment & maybe for some other minor edge cases, but dataflows look like the way forward for most people.
Thanks @takolota for all you have provided on this solution, I learned a ton from your examples! For those who are creating solutions for your organization, I would suggest just using Plumsail's CSV parser; it's only $25/month for 500 executions. I'm quite certain your org would pay for that versus you spending hours or days making sure a custom parser works as expected... it literally takes 5 minutes to set up and start running.
Thanks @Anonymous. Some people throughout the forums have needed non-3rd-party solutions for security reasons.
And it looks like dataflows should be a non-3rd-party solution, free with most Microsoft accounts, so I don't know why people would use Plumsail or similar services if CSV parsing is all they needed.
Ahh... yes, good point @takolota on the non-third-party policies, that makes sense. I will give this dataflow solution a try just to help build my KB, so thanks again for sharing. Paul always makes things sound so easy in his videos, so I look forward to trying it out. :) Cheers.
Just found something that really should have been mentioned in that dataflows video: it currently only uploads to Dataverse or to Dataverse for Teams.
Of course they only provided a half solution.