I have a flow that gets data from Azure Automation. The result can be either a single object or multiple objects, which are returned as an array.
The issue I'm having is: if I use the multiple-objects result to build my JSON schema, it only works for multiple objects and fails when the result is a single object. Likewise, if I switch the schema to a single object, it fails whenever I get results in the form of an array.
Is there a good way to dynamically switch the schema between object and array?
ARRAY SCHEMA
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "From": { "type": "string" },
      "TO": { "type": "string" },
      "Subject": { "type": "string" },
      "IsRead": { "type": "string" },
      "hasAttachments": { "type": "string" },
      "RecipientType": { "type": "string" },
      "RecallStatus": { "type": "string" },
      "SentTime-UTC": { "type": "string" },
      "Error": { "type": "string" }
    },
    "required": [
      "From",
      "TO",
      "Subject",
      "IsRead",
      "hasAttachments",
      "RecipientType",
      "RecallStatus",
      "SentTime-UTC",
      "Error"
    ]
  }
}
Example of SINGLE OBJECT result from Azure Automation
{
  "From": "jack@domain.com",
  "TO": "don@domain.com",
  "Subject": "tester",
  "IsRead": "False",
  "hasAttachments": "False",
  "RecipientType": "domainuser",
  "RecallStatus": "SUCCESS",
  "SentTime-UTC": "7/31/2021 6:56 PM UTC",
  "Error": ""
}
Example of a multiple-object result (array) from Azure Automation
[
  {
    "From": "don@domain.com",
    "TO": "jack@domain.com",
    "Subject": "natsu",
    "IsRead": "False",
    "hasAttachments": "False",
    "RecipientType": "domainuser",
    "RecallStatus": "SUCCESS",
    "SentTime-UTC": "7/31/2021 5:28 PM UTC",
    "Error": ""
  },
  {
    "From": "don@domain.com",
    "TO": "Team@domain.com",
    "Subject": "natsu",
    "IsRead": "False",
    "hasAttachments": "False",
    "RecipientType": "domainuser",
    "RecallStatus": "SUCCESS",
    "SentTime-UTC": "7/31/2021 5:28 PM UTC",
    "Error": ""
  },
  {
    "From": "don@domain.com",
    "TO": "alan@domain.com",
    "Subject": "natsu",
    "IsRead": "False",
    "hasAttachments": "False",
    "RecipientType": "domainuser",
    "RecallStatus": "SUCCESS",
    "SentTime-UTC": "7/31/2021 5:28 PM UTC",
    "Error": ""
  }
]
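To make the shape difference concrete, here is a minimal Python sketch (not part of the flow, just an illustration; the payloads are abbreviated versions of the examples above) showing that the same kind of content can deserialize to either an object or an array, which is why one fixed schema cannot cover both cases:

```python
import json

# Abbreviated payloads shaped like the Azure Automation results above.
single = '{"From": "jack@domain.com", "TO": "don@domain.com", "Error": ""}'
multiple = '[{"From": "don@domain.com", "TO": "jack@domain.com", "Error": ""}]'

def shape(payload: str) -> str:
    """Report whether a JSON payload deserializes to an object or an array."""
    data = json.loads(payload)
    return "array" if isinstance(data, list) else "object"

print(shape(single))    # object
print(shape(multiple))  # array
```

A Parse JSON action validating against the array schema rejects the first payload, and the object schema rejects the second.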
Thanks for your help.
T
If you know that those are the only two possible schemas, you can use two Parse JSON blocks, one for each schema, both hooked to the same content in the 'Content' field. Then use 'Configure Run After' on the second block ('Parse JSON 2', the single-object version) and set it to run when the previous block 'has failed'. If the first block fails because the array schema encountered a single object, the fallback block parses the single object instead, and the flow continues correctly.
Here I am using a Compose block at the top of the flow to simulate the input of a single object returned by Azure:
You would then do something like this:
(To access these options, click the ellipsis in the upper right of 'Parse JSON 2' and select 'Configure Run After'.)
Then, on the 'Parse JSON 2 should run after' screen, uncheck 'is successful', check 'has failed', and click 'Done'.
After you do this correctly, there should be a red arrow with a dashed line between the two blocks, like this:
Note: the schema to use for the single object (in the second Parse JSON block) is:
{
  "type": "object",
  "properties": {
    "From": { "type": "string" },
    "TO": { "type": "string" },
    "Subject": { "type": "string" },
    "IsRead": { "type": "string" },
    "hasAttachments": { "type": "string" },
    "RecipientType": { "type": "string" },
    "RecallStatus": { "type": "string" },
    "SentTime-UTC": { "type": "string" },
    "Error": { "type": "string" }
  }
}
I noticed it was not in your original post, so to be safe, this is the schema to use for the single-object case.
Then, when testing with the single-object input, the flow succeeds, failing over from the block expecting an array to the one expecting a single object, as you can tell from the green checkmark on the second block below:
The overall result of the flow is successful despite the error, because the error is handled by 'Configure Run After'.
When doing this, you may find you seemingly need to duplicate the apply-to-each logic. To avoid that, you could move the logic for processing each individual item into a child flow, which eliminates the duplication.
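The run-after fallback can be sketched in Python (names and the wrapping behavior are illustrative, not a Power Automate API): try the array-schema parse first, and only on failure fall back to the single-object parse so downstream logic always sees a list.

```python
import json

def parse_with_fallback(payload: str) -> list:
    """Mimic two Parse JSON actions wired with 'Configure Run After':
    the first expects an array; if that fails, the fallback expects a
    single object and wraps it so later steps always get a list."""
    data = json.loads(payload)
    if isinstance(data, list):      # 'Parse JSON' (array schema) succeeds
        return data
    if isinstance(data, dict):      # 'Parse JSON 2' (object schema) fallback
        return [data]
    raise ValueError("payload matches neither schema")

# Both shapes end up as a list of records:
parse_with_fallback('{"From": "a@b.com"}')    # -> [{'From': 'a@b.com'}]
parse_with_fallback('[{"From": "a@b.com"}]')  # -> [{'From': 'a@b.com'}]
```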
@Anonymous
Or, if you prefer a solution with just one Parse JSON block, that is possible too; here is a potentially smarter approach:
In the above, a Compose action evaluates a "length" expression, which fails if the input is a single object. In that case, you can concat '[' and ']' around the input and use the concatenated result as the input for Parse JSON; I tested this and it works.
To do this, you would initialize a variable and set it in one of two branches: if the length call succeeds, set the variable to the input as-is (it is already correct); if the length call fails, do the concatenation and set the variable to that result.
For this to work correctly with variables, you might need to wrap the concat in an array expression to explicitly convert it to an array:
array(concat('[',outputs('Compose'),']'))
I did not specifically test this variant with variables; it is just a quick alternative that came to mind in case you like it better.
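The length-check-plus-concat idea amounts to the following string-level normalization, sketched here in Python (assuming the variable holds the raw JSON text, as it does in the flow):

```python
import json

def normalize_to_array_text(payload: str) -> str:
    """If length() would succeed (the payload is already an array),
    keep it as-is; otherwise emulate concat('[', payload, ']') so a
    single Parse JSON action with the array schema can handle it."""
    data = json.loads(payload)          # raises on invalid JSON
    if isinstance(data, list):
        return payload                  # already an array: leave intact
    return "[" + payload + "]"          # single object: wrap in brackets

normalize_to_array_text('{"From": "a@b.com"}')  # -> '[{"From": "a@b.com"}]'
```

Either way, the text handed to Parse JSON always describes an array, so only the array schema is needed.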
Thanks, guys, for the lightning-speed response. I went with the first suggestion. I had to make some other modifications downstream because of the two-variables situation it created, but everything is working now. When things settle down, I'll try the more efficient method.
@Anonymous
I wanted to give you more detail about the "smarter method", as it is a bit more involved to set up. It could be worth it if you prefer it, since you would then have only one Parse JSON instead of two. I recommend testing it to be sure it works correctly for you.
Here is how you might do it if you wanted only one Parse JSON action instead of two:
Assuming you have this hard-coded input block to test with:
You would start from wherever you put the 'Configure Run After: has failed' setting in the first method, and add a parallel branch that runs on success. This is the branch that runs if the length call does not fail. To add a parallel branch, click the plus sign between the steps (in your case, where the red arrow with the dashed line is) and click "Add a parallel branch", like this:
I'll show a flow I tested that works for both the single-object input and the array input with only one Parse JSON action, and highlight some things to pay attention to after these images:
Then at the very bottom, the Parse JSON action:
Here is how the flow looks at the overall level, above. The following images show the values for each expression:
For the converging step (Compose 3 in this example), it is very important to get the 'Configure Run After' settings right: click on both entries in the 'Configure Run After' menu and set both checkboxes. If done correctly, it should look like the image below:
Notice that "Succeeded, Skipped" shows up twice, once for 'Set variable' and once for 'Set variable 2'; it must appear twice for this to work. Click each purple box to set the values for each one. Check this step carefully: if it is not done correctly, one or both branches (the object branch, the array branch, or both) will never reach the Parse JSON step.
If done correctly, here is a test run with the single-object version of the input:
Notice that it works with the array-schema Parse JSON even though the input was just a single object.
And here is a test run for the multi object (Array) version:
You might wonder why the variable is of type String. When I tested with an Array-type variable and the array() formula I conjectured in my quick post, it did not work: array() wrapped the string inside an array rather than converting the string to an array. It worked when I left the array() function out and simply used a String variable. The array branch on the right side auto-converts to a string when setting the variable, and the left side already produces a string, which is why the variable should be a String.
As for Parse JSON, it can parse the content correctly even when it is a serialized JSON string rather than an actual array. I tested this by adding an Apply to Each at the end of the flow on the 'Body' of Parse JSON, with two elements as the input; it did two iterations, so it works.
I tested this approach, and it works with just one Parse JSON; the one I use in this example is the array-schema version, by the way. The flow leaves the input intact if it is already an array (though it converts it to a string, which Parse JSON still understands correctly). If it is not an array, it uses the concat trick and saves that as a string, which also works.
The variable must be a String because a variable can have only one type in Power Automate. This works because the array auto-converts to a string on the right side, and the left side produces a string anyway. It also helps that the Parse JSON action accepts a serialized JSON string as input.
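The round trip being relied on here (an array serialized into a String variable, then parsed back into an array of objects) can be sketched with Python's json module standing in for the flow's implicit conversion and the Parse JSON action:

```python
import json

items = [{"From": "don@domain.com"}, {"From": "jack@domain.com"}]

# Setting a String variable from an array serializes it...
as_string = json.dumps(items)
assert isinstance(as_string, str)

# ...and a Parse JSON-style step restores the original array of objects.
restored = json.loads(as_string)
assert restored == items
```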
I would still test whether this automatic conversion to and from String ever hits an issue on some specific data. I have never encountered a problem attributable to Power Automate serializing an array to a string and Parse JSON turning it back into an array of objects, so it appears reliable, but if you feel unsure, test it when using this method.
Check whether the above helps and lets you use only one Parse JSON instead of two. If you prefer the solution you already implemented from my first post, you can of course keep using that.