I am using the Export to File for Paginated Reports step, which creates a CSV file.
The output of the step is below:
{
"statusCode": 200,
"headers": {
"Pragma": "no-cache",
"Transfer-Encoding": "chunked",
"Strict-Transport-Security": "max-age=31536000; includeSubDomains",
"X-Frame-Options": "deny",
"X-Content-Type-Options": "nosniff",
"RequestId": "27bd1808-fc47-457c-b3b6-b91933d08050",
"Access-Control-Expose-Headers": "RequestId",
"Timing-Allow-Origin": "*",
"x-ms-apihub-cached-response": "true",
"x-ms-apihub-obo": "false",
"Cache-Control": "no-store, must-revalidate, no-cache",
"Date": "Wed, 16 Nov 2022 10:26:49 GMT",
"Content-Type": "text/csv",
"Content-Length": "944082"
},
"body": "ClientID,Username,FirstName,Surname,ClientProfile,Channel,Affiliate\r\n0000,0000,xxxx,xxxx,xxxx,xxxxx,xxxxx\r\0000,0000,xxxx,xxxxx,xxxxx,xxxxx\r\n
How do I create an HTML table from this CSV output?
Hello!
This thread has a solution that should work in your case too:
Convert .csv file into an HTML table - Power Platform Community (microsoft.com)
Hey @drewbty
You can use the Create HTML table action for this.
But it only accepts an array as input.
So, do one thing: create a Compose action and store the above output in it.
Then make another Compose and in it write:
createArray(outputs('Compose'))
or, if you want the headers, you can write createArray(outputs('Compose').headers)
and if you want only the body, you can write createArray(outputs('Compose').body)
Then pass this Compose into the Create HTML table action.
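Just to sketch what those expressions would evaluate to (assuming the first Compose holds the response object posted above, and using the masked sample values):

createArray(outputs('Compose'))
// => [ { "statusCode": 200, "headers": { ... }, "body": "ClientID,Username,..." } ]
createArray(outputs('Compose').headers)
// => [ { "Pragma": "no-cache", ..., "Content-Type": "text/csv" } ]
createArray(outputs('Compose').body)
// => [ "ClientID,Username,FirstName,Surname,ClientProfile,Channel,Affiliate\r\n0000,0000,..." ]

Note that each of these is a single-item array wrapping the whole value, not one item per CSV row.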
This is how the flow looks:
1st Compose:
Contains your above data
2nd Compose:
Used a Compose action with the createArray function on the headers.
3rd Compose:
Passed in the outputs of Compose 2.
Output:
Or, if you want everything in that HTML table action at the same time (headers, body and status code), you will have to go into the advanced settings of the Create HTML table action and define the columns manually rather than automatically.
If this was the solution you were looking for, you can accept it as a solution, and if you liked the explanation, you can give it a thumbs up. 🙂
Getting the following error when attempting this.
Do I have to select the columns manually? If so, how do I access each one?
I am seeing that orange arrow for the first time.
@StretchFredrik have you seen the orange arrow before or have any idea about it?
@drewbty I think the problem could be with that Power BI action. Can you expand it in the run history and post a screenshot here?
Also, please post what you have added in the Create HTML table action in edit mode.
I think it just means that retries occurred?
{"statusCode":200,"headers":{"Pragma":"no-cache","Transfer-Encoding":"chunked","Strict-Transport-Security":"max-age=31536000; includeSubDomains","X-Frame-Options":"deny","X-Content-Type-Options":"nosniff","RequestId":"4d4336f9-8172-4dbe-81c8-dcd2763f5e49","Access-Control-Expose-Headers":"RequestId","Timing-Allow-Origin":"*","x-ms-apihub-cached-response":"true","x-ms-apihub-obo":"false","Cache-Control":"no-store, must-revalidate, no-cache","Date":"Wed, 16 Nov 2022 11:57:22 GMT","Content-Type":"text/csv","Content-Length":"944082"},"body":"ClientID,Username,FirstName,Surname,ClientProfile,Channel,Affiliate\r\nxxxxx,xxxx,xxxx,xxxx,xxxxx\r\n
I've attached a sample of the first line of output here, and the screenshot below.
What was the final output you were expecting to see?
Also, do you have the full JSON? The JSON you posted isn't complete/valid: there is a missing n after \r\, and there should be 7 columns of data, but the second row only contains 6.
"ClientID,Username,FirstName,Surname,ClientProfile,Channel,Affiliate\r\n0000,0000,xxxx,xxxx,xxxx,xxxxx,xxxxx\r\0000,0000,xxxx,xxxxx,xxxxx,xxxxx\r\n
And lastly, will the fields be the same every time? (ClientID,Username,FirstName,Surname,ClientProfile,Channel,Affiliate)
Hi Grant
The expected final output is an HTML table; the final step in the flow is Create HTML table.
I can't post the full JSON as I'd have to change all the values to xxxx for integrity reasons, which would take too long. (I thought it was CSV as that's the file output I selected in the flow step.)
Assume there is a \r and \n in all the right places. Apologies, that was just a typo when I was manually blacking out the field data.
The second row only contains 6 because the final field has no data entry. Only the final column (Affiliate) can ever be blank.
Yes, the columns will be the same every time.
Here's a flow that will hopefully get what you're after. This assumes the only field that could be blank (as you mentioned) is the last field Affiliate.
The full flow is below. I'll go into each of the actions.
Compose JSON contains your sample JSON data (including just 6 items in the second record).
Compose Array uses the following expression. It splits the body by \r\n and skips the first row since that contains our headers and we only want the data.
skip(split(outputs('Compose_JSON')?['body'], decodeUriComponent('%0D%0A')), 1)
Filter array just removes the last item (an empty string). The expression is just:
item()
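To make that concrete, with the sample body above Compose Array would hold something like the following (decodeUriComponent('%0D%0A') is just a way of writing the \r\n delimiter inside an expression):

skip(split(outputs('Compose_JSON')?['body'], decodeUriComponent('%0D%0A')), 1)
// => [ "0000,0000,xxxx,xxxx,xxxx,xxxxx,xxxxx", "0000,0000,xxxx,xxxxx,xxxxx,xxxxx", "" ]
// The trailing "" comes from the final \r\n, and that empty item is what Filter array strips out.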
Apply to each iterates over each of the items from Filter array.
Compose Split uses the following expression to split each of the items by comma. It will give us an array containing each of the items in the current row.
split(items('Apply_to_each'), ',')
Compose Object builds up an object for each row. The expressions are as follows:
outputs('Compose_Split')[0]
outputs('Compose_Split')[1]
outputs('Compose_Split')[2]
outputs('Compose_Split')[3]
outputs('Compose_Split')[4]
outputs('Compose_Split')[5]
//And the last field that could potentially be empty
if(equals(length(outputs('Compose_Split')), 6), '', outputs('Compose_Split')[6])
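To show why that length check is needed (placeholder values only):

split('0000,0000,xxxx,xxxx,xxxx,xxxxx,xxxxx', ',')
// 7 items, so index [6] returns the Affiliate value
split('0000,0000,xxxx,xxxx,xxxx,xxxxx', ',')
// only 6 items, so the if() returns '' instead of failing on index [6]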
Looking at the raw code using Peek code, we see:
{
"inputs": {
"Client ID": "@{outputs('Compose_Split')[0]}",
"Username": "@{outputs('Compose_Split')[1]}",
"First Name": "@{outputs('Compose_Split')[2]}",
"Surname": "@{outputs('Compose_Split')[3]}",
"Client Profile": "@{outputs('Compose_Split')[4]}",
"Channel": "@{outputs('Compose_Split')[5]}",
"Affiliate": "@{if(equals(length(outputs('Compose_Split')), 6), '', outputs('Compose_Split')[6])}"
}
}
And finally, after the Apply to each, we have our Create HTML table which takes in the output from Compose Object.
outputs('Compose_Object')
After running the flow, we would have the following HTML table.
Couple of things:
Just to flag first: my first step is Export to File for Paginated Reports and I have selected the output type as CSV, not JSON, if this matters to your flow.
The flow started running so here's hoping...
Update: 40 minutes later it was still running the Apply to each step, so I had to cancel. I can see it's not designed for large sets of data, which is fine, as there would probably only be a couple of hundred rows in actuality.
I updated the Export to File for Paginated Reports parameter to just have ~200 rows and it runs through the core parts.
However, I still get the following error at the final step, Create HTML table.
I've tried to do the columns manually by selecting them with item()?['ColumnName'] (not sure if the syntax is right for this data type, but it has worked for me before).
No luck, however.
To highlight something: my Compose Object step looks a little different, as I couldn't put curly brackets at the top and bottom; it gave an 'enter a valid json' error.
My Compose Object Peek code looks like this:
{
"inputs": "\"ClientID\": \"@{outputs('Compose_Split')[0]}\",\n\"Username\": \"@{outputs('Compose_Split')[1]}\",\n\"FirstName\": \"@{outputs('Compose_Split')[2]}\",\n\"Surname\": \"@{outputs('Compose_Split')[3]}\",\n\"ClientProfile\": \"@{outputs('Compose_Split')[4]}\",\n\"Channel\": \"@{outputs('Compose_Split')[5]}\",\n\"Affiliate\": \"@{if(equals(length(outputs('Compose_Split')), 6), '', outputs('Compose_Split')[6])}\"",
"metadata": {
"operationMetadataId": "3e89534f-52a2-414a-8f3b-1bb1dd468193"
}
}
I'm unsure if my Compose Object step is actually doing anything, as it shows the same input and output, just for one row.
Is this the expected format?
Hi @drewbty, it seems to me that the Compose object is missing the enclosing curly brackets. Please check @grantjenkins' post; the brackets are highlighted in the Compose screenshot below.
I'll see if I can get rid of that Apply to each completely. Give me a few hours.
Ok, new and improved solution without the Apply to each, or having to add the { } to build up each object. Also, much more efficient and not using up your billable actions.
Full flow is below. I'll go into each of the actions.
Compose JSON contains your sample JSON data (including just 6 items in the second record).
Compose Array uses the following expression. It splits the body by \r\n and skips the first row since that contains our headers and we only want the data.
skip(split(outputs('Compose_JSON')?['body'], decodeUriComponent('%0D%0A')), 1)
Filter array just removes the last item (an empty string). The expression is just:
item()
Select takes in the output from Filter array. It uses the following expressions for each of the fields.
split(item(), ',')[0]
split(item(), ',')[1]
split(item(), ',')[2]
split(item(), ',')[3]
split(item(), ',')[4]
split(item(), ',')[5]
//And the last field that could potentially be empty
if(equals(length(split(item(), ',')), 6), '', split(item(), ',')[6])
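For reference, the Peek code for the Select action would look roughly like this (the column names are just illustrative, and the from property assumes the Filter array action is literally named Filter array):

{
    "inputs": {
        "from": "@body('Filter_array')",
        "select": {
            "Client ID": "@{split(item(), ',')[0]}",
            "Username": "@{split(item(), ',')[1]}",
            "First Name": "@{split(item(), ',')[2]}",
            "Surname": "@{split(item(), ',')[3]}",
            "Client Profile": "@{split(item(), ',')[4]}",
            "Channel": "@{split(item(), ',')[5]}",
            "Affiliate": "@{if(equals(length(split(item(), ',')), 6), '', split(item(), ',')[6])}"
        }
    }
}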
And finally, we have our Create HTML table which takes in the output from Select.
After running the flow, we would have the following HTML table.
@grantjenkins this is next level, thanks. Appreciate your help!!
Question for you: if I were to apply this to other reports that might have empty values in other field positions (not always the final field like Affiliate here), is there a way of handling that too?
It would depend on the raw data. At the moment, if there is a missing value it just isn't added. If there were two missing values and you only got 4 of the 6 values (for example), you might get xxxx,xxxxx,xxxxx,xxxxx. In that case, how would you know whether it was the second value that was missing, or the third, etc.?
@grantjenkins I seem to have missed a final comma after blacking out the data.
If there is a blank in the Affiliate field, it would display like this: xxxx,xxxxx,xxxxx,xxxx,\r\n
If there is a blank in a field in position 2, it would be like xxxxx,,xxxxx,xxxxx
In other reports, the only other way I've found Power Automate might output the blanks is like this: xxxxx,”,xxxx,xxxx
So I think the blank inputs appear in just one of two ways in the output:
a double comma, i.e. ,,
or a comma, quotation marks, then a comma, i.e. ,”,
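Just to illustrate how the split approach sees those two cases (this isn't from the flow above, just how the split() function behaves):

split('xxxxx,,xxxxx,xxxxx', ',')
// => ["xxxxx", "", "xxxxx", "xxxxx"]  (the blank keeps its position, so split(item(), ',')[1] simply returns '')
split('xxxxx,”,xxxx,xxxx', ',')
// => ["xxxxx", "”", "xxxx", "xxxx"]  (position 1 holds the quote character itself, so it would need cleaning up before use)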