Robin-Cr
Frequent Visitor

Parse Json multi nested

Hi!

 

I am having some trouble with converting my HTTP get connector to a working JSON. The ultimate goal is to store the data in a database. 

 

The raw data looks like:

{
    "2023-01-01 00:15+01:00": [
        {
            "unit": "kWh",
            "type": "E",
            "rate": "low",
            "direction": "consumption",
            "value": 30.0
        },
        {
            "unit": "kWh",
            "type": "E",
            "rate": "normal",
            "direction": "consumption",
            "value": 0
        }
    ],
    "2023-01-01 00:30+01:00": [
        {
            "unit": "kWh",
            "type": "E",
            "rate": "low",
            "direction": "consumption",
            "value": 31.0
        },
        {
            "unit": "kWh",
            "type": "E",
            "rate": "normal",
            "direction": "consumption",
            "value": 0
        }
    ],
    "2023-01-01 00:45+01:00": [
        {
            "unit": "kWh",
            "type": "E",
            "rate": "low",
            "direction": "consumption",
            "value": 32.0
        },
        {
            "unit": "kWh",
            "type": "E",
            "rate": "normal",
            "direction": "consumption",
            "value": 0
        }
    ]
}
 
I have tried a few things, however I keep getting errors saying the values return NULL or are not in a valid array. In this case each date has its own values, twice over: one entry for the normal rate and one for the low rate.
I am really only looking for the one that is filled. So if the low value is filled, insert that into the database; if normal is filled, insert that one instead. However, I am already having issues with inserting all of it into the database.
 
Does anyone have a solution for parsing the above values into a database?
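
To make the logic concrete, here is a rough Python sketch of the transformation I'm after (illustrative only, not part of the flow; the sample payload is trimmed to one timestamp):

import json

# Trimmed sample in the same shape as the HTTP GET response
raw = '{"2023-01-01 00:15+01:00": [{"unit": "kWh", "type": "E", "rate": "low", "direction": "consumption", "value": 30.0}, {"unit": "kWh", "type": "E", "rate": "normal", "direction": "consumption", "value": 0}]}'

rows = []
for date, readings in json.loads(raw).items():
    for reading in readings:
        # Keep only the rate that actually has a value (low or normal)
        if reading["value"] > 0:
            rows.append({"date": date, "rate": reading["rate"], "value": reading["value"]})

print(rows)
# [{'date': '2023-01-01 00:15+01:00', 'rate': 'low', 'value': 30.0}]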
 
Thanks in advance.
ACCEPTED SOLUTION

Attempt number 20 😉.

 

See full flow below. I'll go into each of the actions.

grantjenkins_0-1676638890875.png

 

JSON is a Compose that contains your data.

grantjenkins_1-1676638920351.png

 

Select takes the output from JSON and uses the following expressions:

//From
split(replace(string(outputs('JSON')), ']', '['), '[')

//Map
item()

grantjenkins_2-1676638999090.png

 

Filter array takes the output from Select. The item to filter on is:

item()

//The actual filter expression is:
@not(startsWith(item(), '}'))
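
To see what Select and Filter array produce together, here is a rough Python equivalent of the same replace/split/filter trick (illustrative only, not the flow itself; payload trimmed to one timestamp):

import json

payload = {"2023-01-01 00:15+01:00": [
    {"unit": "kWh", "type": "E", "rate": "low", "direction": "consumption", "value": 30.0},
    {"unit": "kWh", "type": "E", "rate": "normal", "direction": "consumption", "value": 0}]}

# string(outputs('JSON')) -> compact JSON text
text = json.dumps(payload, separators=(",", ":"))

# replace(..., ']', '[') then split(..., '[')
segments = text.replace("]", "[").split("[")

# @not(startsWith(item(), '}')) drops the trailing '}' fragment
segments = [s for s in segments if not s.startswith("}")]

# The result alternates a date fragment with an objects fragment:
# ['{"2023-01-01 00:15+01:00":',
#  '{"unit":"kWh",...,"value":30.0},{"unit":"kWh",...,"value":0}']
print(segments)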

grantjenkins_3-1676639082778.png

 

Initialize variable Items creates an Array variable called items that will eventually contain the items we want.

grantjenkins_4-1676639127215.png

 

Apply to each iterates over the items in our Filter array in chunks of 2 (2 items at a time). The first item is the date and the second item is the actual objects.

chunk(body('Filter_array'), 2)
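
In Python terms, the chunking step looks roughly like this (illustrative only, fragments trimmed):

# chunk(body('Filter_array'), 2) pairs each date fragment with the objects fragment after it
segments = ['{"2023-01-01 00:15+01:00":',
            '{"rate":"low","value":30.0},{"rate":"normal","value":0}']  # trimmed

pairs = [segments[i:i + 2] for i in range(0, len(segments), 2)]
print(pairs[0][0])  # date fragment    -> first(item()) in the flow
print(pairs[0][1])  # objects fragment -> last(item()) in the flow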

grantjenkins_5-1676639214059.png

 

Compose uses the following expression to get the objects in JSON format.

json(concat('{"values":[', last(item()), ']}'))
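
A rough Python equivalent of what this Compose does with the objects fragment (illustrative only, fragment trimmed):

import json

objects_fragment = '{"rate":"low","value":30.0},{"rate":"normal","value":0}'  # last(item())

# concat('{"values":[', ..., ']}') then json(...) turns the fragment back into real objects
wrapped = json.loads('{"values":[' + objects_fragment + ']}')
print(wrapped["values"])
# [{'rate': 'low', 'value': 30.0}, {'rate': 'normal', 'value': 0}]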

grantjenkins_6-1676639284271.png

 

Filter array Non Zero filters our data so only the items where value is greater than zero are kept.

//From
outputs('Compose')?['values']

//Filter
item()?['value']

//The full filter expression is:
@greater(item()?['value'], 0)
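
The same filter in Python terms (illustrative only):

values = [{"rate": "low", "value": 30.0}, {"rate": "normal", "value": 0}]

# @greater(item()?['value'], 0)
non_zero = [v for v in values if v["value"] > 0]
print(non_zero)  # [{'rate': 'low', 'value': 30.0}]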

grantjenkins_7-1676639396804.png

 

Append to array variable Items adds the object to the array while also adding the date property. Because the dates are ISO dates, we can safely assume the length will always be the same regardless of date and time.

addProperty(first(body('Filter_array_Non_Zero')), 'date', slice(first(item()), 2, 24))
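
To show why slice(..., 2, 24) works: the date fragment always looks like {"2023-01-01 00:15+01:00": or ,"2023-01-01 00:30+01:00":, so characters 2 through 23 are exactly the 22-character date key. A rough Python equivalent (illustrative only):

date_fragment = '{"2023-01-01 00:15+01:00":'  # first(item())
reading = {"rate": "low", "value": 30.0}      # first(body('Filter_array_Non_Zero')), trimmed

# slice(first(item()), 2, 24) -> characters 2..23, i.e. the date key
date = date_fragment[2:24]

# addProperty(...) returns a copy of the object with the extra 'date' property
row = dict(reading, date=date)
print(row)  # {'rate': 'low', 'value': 30.0, 'date': '2023-01-01 00:15+01:00'}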

grantjenkins_8-1676639470582.png

 

That should give you all the objects within the Items array variable. After the Apply to each, I've added another Compose (Output) that just shows the contents of the items array.

grantjenkins_9-1676639577279.png

 

And the final output:

[
  {
    "unit": "kWh",
    "type": "E",
    "rate": "normal",
    "direction": "consumption",
    "value": 30,
    "date": "2023-01-01 00:15+01:00"
  },
  {
    "unit": "kWh",
    "type": "E",
    "rate": "low",
    "direction": "consumption",
    "value": 31,
    "date": "2023-01-01 00:30+01:00"
  },
  {
    "unit": "kWh",
    "type": "E",
    "rate": "low",
    "direction": "consumption",
    "value": 32,
    "date": "2023-01-01 00:45+01:00"
  }
]

grantjenkins_10-1676639628320.png

 


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.


What would you be looking to add to your database? Did you want the date and the value, are you just adding all the values, or all the properties of the entry that has a value?

 

As an example, which of these are you looking to get out of your JSON data?

[
    30.0,
    31.0,
    32.0
]

 

Or this:

[
    {
        "date": "2023-01-01 00:15+01:00",
        "value": 30.0
    },
    {
        "date": "2023-01-01 00:30+01:00",
        "value": 31.0
    },
    {
        "date": "2023-01-01 00:45+01:00",
        "value": 32.0
    }
]

 

Or this:

[
    {
        "date": "2023-01-01 00:15+01:00",
        "unit": "kWh",
        "type": "E",
        "rate": "low",
        "direction": "consumption",
        "value": 30.0
    },
    {
        "date": "2023-01-01 00:30+01:00",
        "unit": "kWh",
        "type": "E",
        "rate": "low",
        "direction": "consumption",
        "value": 31.0
    },
    {
        "date": "2023-01-01 00:45+01:00",
        "unit": "kWh",
        "type": "E",
        "rate": "low",
        "direction": "consumption",
        "value": 32.0
    }
]

 

Or something else?


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

Hi, thanks for taking the time to respond.

Looking to add a mix between the 2nd and the last, with the exception that if the normal-rate value is filled I want to add those values instead. The data only ever has a normal or a low rate per 15 minutes. For example:

[
    {
        "date": "2023-01-01 00:15+01:00",
        "rate": "low",
        "value": 30.0
    },
    {
        "date": "2023-01-01 00:30+01:00",
        "rate": "normal",
        "value": 31.0
    },
    {
        "date": "2023-01-01 00:45+01:00",
        "rate": "low",
        "value": 32.0
    }
]

If it is impossible to switch between the two rates, I would also be happy with:

[
    {
        "date": "2023-01-01 00:15+01:00",
        "rate": "low",
        "value": 30.0
    },
    {
        "date": "2023-01-01 00:15+01:00",
        "rate": "normal",
        "value": 0
    },
    {
        "date": "2023-01-01 00:30+01:00",
        "rate": "low",
        "value": 0
    },
    {
        "date": "2023-01-01 00:30+01:00",
        "rate": "normal",
        "value": 31.0
    },
    {
        "date": "2023-01-01 00:45+01:00",
        "rate": "low",
        "value": 32.0
    },
    {
        "date": "2023-01-01 00:45+01:00",
        "rate": "normal",
        "value": 0
    }
]

However, it would be an added bonus to switch between the two, considering that halves our data. Thanks!

Hopefully this is what you're looking for. It was much more challenging than I first thought. I was planning on taking the data, converting it to XML, then running some XPath expressions. However, the data (the dates) isn't in a valid format for converting to XML, so I had to make some changes to make it valid, then reverse them at the end. You'll need to do some testing to ensure it works with your data, but hopefully it works as expected.

 

See full flow below. I'll go into each of the actions.

grantjenkins_0-1675933747107.png

 

Initialize variable creates a variable of type string called data. This contains the data that you would get from your HTTP connector.

grantjenkins_1-1675933856346.png

 

Select splits the data on the newline character and applies some changes to the date property names so they're valid for converting to XML. The expressions used are:

//From
//You would pass in your data from your HTTP connector instead of the variable that I have
split(variables('data'), decodeUriComponent('%0A'))

//Map
//We're replacing + with --- and : with xxx (except for the last : before the [)
if(endsWith(item(), ': ['),
    replace(replace(replace(item(), '+', '---'), ':', 'xxx'), 'xxx ', ':'),
    item()
)
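
A rough Python equivalent of that per-line substitution, so it's clear what the date keys end up looking like before the XML conversion (illustrative only):

line = '    "2023-01-01 00:15+01:00": ['

# Only lines ending with ': [' are date keys; swap out the characters that
# would make an invalid XML element name, restoring the final key/value colon
if line.endswith(': ['):
    line = line.replace('+', '---').replace(':', 'xxx').replace('xxx ', ':')

print(line)  # '    "2023-01-01 00xxx15---01xxx00":['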

grantjenkins_2-1675934019190.png

 

XML is a Compose that converts the data from our Select into XML. The expression used is:

//It joins the array items back to a string, then converts to XML
xml(json(concat('{"root": { value:', json(join(body('Select'), decodeUriComponent('%0A'))), '}}')))

grantjenkins_5-1675934590341.png

 

Select Output uses XPath on the XML output to extract out only the items where the value is greater than zero. See expressions below:

//From
xpath(outputs('XML'), '//root/value/*[value/text() > "0"]')

//Date
//We revert our original changes to the date
replace(replace(replace(replace(xpath(item(), 'name(//*)'), '---', '+'), 'xxx', ':'), '_x0020_', ' '), '_x0032_', '2')

//Rate
xpath(item(), 'string(//rate/text())')

//Value
xpath(item(), 'number(//value/text())')
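
The XPath part of this step, shown on a simplified structure using Python and lxml (illustrative only; in the real flow the element names are the encoded date keys, with _x0032_ for the leading 2 and _x0020_ for the space, which is why the Date expression reverses those):

from lxml import etree  # requires the lxml package

# Simplified stand-in for the XML produced by the Compose above
doc = etree.fromstring(
    "<root><value>"
    "<reading><rate>low</rate><value>30.0</value></reading>"
    "<reading><rate>normal</rate><value>0</value></reading>"
    "</value></root>"
)

# Same idea as //root/value/*[value/text() > "0"]
for node in doc.xpath('/root/value/*[value/text() > 0]'):
    print(node.xpath('string(rate/text())'),   # like xpath(item(), 'string(//rate/text())')
          node.xpath('number(value/text())'))  # like xpath(item(), 'number(//value/text())')
# prints: low 30.0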

grantjenkins_3-1675934250652.png

 

The final output after running the flow is below:

[
  {
    "Date": "2023-01-01 00:15+01:00",
    "Rate": "low",
    "Value": 30
  },
  {
    "Date": "2023-01-01 00:30+01:00",
    "Rate": "low",
    "Value": 31
  },
  {
    "Date": "2023-01-01 00:45+01:00",
    "Rate": "low",
    "Value": 32
  }
]

grantjenkins_6-1675934710873.png


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.


Robin-Cr
Frequent Visitor

Thanks! The last bit works great. When I use a static variable, the data is successfully loaded into the database. However, I am having some issues when using the HTTP connector instead.

I reckon it has to do with the \ characters in the post. But when trying to replace them I get different errors. Do you have any idea how to solve this issue?

See attached text document. (Couldn't upload a .txt, so I zipped it.)

 

Thanks again.

Are you able to wrap your HTTP output into a json expression to see if that fixes it?


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

That is something I have tried. The first bit of the result I input into the split then looks like:

{"body":{"2023-01-01 00:15+01:00":[{"unit":"kWh","type":"E","rate":"low","direction":"consumption","value":30.0},{"unit":"kWh","type":"E","rate":"normal","direction":"consumption","value":0}],"2023-01-01 00:30+01:00":[{"unit":"kWh","type":"E","rate":"low","direction":"consumption","value":31.0},{"unit":"kWh","type":"E","rate":"normal","direction":"consumption","value":0}],"2023-01-01 00:45+01:00":[{"unit":"kWh","type":"E","rate":"low","direction":"consumption","value":32.0},{"unit":"kWh","type":"E","rate":"normal","direction":"consumption","value":0}],"2023-01-01 01:00+01:00":[{"unit":"kWh","type":"E","rate":"low","direction":"consumption","value":31.0},{"unit":"kWh","type":"E","rate":"normal","direction":"consumption","value":0}],"2023-01-01 01:15+01:00":[{"unit":"kWh","type":"E","rate":"low","direction":"consumption","value":29.0},{"unit":"kWh","type":"E","rate":"normal","direction":"consumption","value":0}],"2023-01-01 01:30+01:00":[{"unit":"kWh","type":"E","rate":"low","direction":"consumption","value":31.0},

This does look a lot like the required conversion. However, the split then gives me the following error:

Unable to process template language expressions in action 'Select' inputs at line '0' and column '0': 'The template language function 'split' expects its first parameter to be of type string. The provided value is of type 'Object'. Please see https://aka.ms/logicexpressions#split for usage details.'.

 

That's why I tried putting it into a string variable. However, this then generates the \.

That data looks much better. All you would need to do is wrap the inner part of the split into a string expression, so it converts your JSON object to a string.

 

split(string(YOUR_DATA), decodeUriComponent('%0A'))

 


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
Robin-Cr
Frequent Visitor

 

It looks like this now:

RobinCr_1-1675950689690.png

The split:

split(string(body('Parse_HTTP')), decodeUriComponent('%0A'))

I get the following error:

RobinCr_2-1675950746983.png

 

 

 

I'm having a bit of trouble splitting this JSON data ☹️. I'm just off to sleep now (midnight for me), but will have another look tomorrow. Might need to go with a different approach.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

Are you able to show some of the output from your Select to see if it's split the data correctly? Just a quick screenshot of the Select output would be fine.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
Robin-Cr
Frequent Visitor

Output of the Parse:

RobinCr_1-1675951341662.png

Output of the select:

RobinCr_0-1675951308668.png

 

Robin-Cr
Frequent Visitor

@grantjenkins Have you had time to look at this issue? It is still something I am having issues with.

@Robin-Cr I think I’ve got it working now. Had to go with a slightly different approach. Just off to bed now (midnight for me) but will finish it off and post to you when I get up.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

Hopefully this will work for you 🙂

 

See full flow below. I'll go into each of the actions.

grantjenkins_0-1676530652432.png

 

JSON is a Compose that contains your data.

grantjenkins_1-1676530695276.png

 

Select Stage 1 uses the following expressions to extract out the data into an array. This is the first stage of transforming the data.

//From
split(replace(string(outputs('JSON')), ']', '['), '[')

//Map
item()

grantjenkins_2-1676530780003.png

 

Filter array uses the output from Select Stage 1 and filters out any items that don't contain the word 'unit'. The expression used here is:

item()

grantjenkins_3-1676530879944.png

 

Select Stage 2 uses the output from Filter array and transforms the items into proper JSON, but into nested arrays which we will sort out in the next actions. The expression used is:

//Map
json(concat('{"values":[', item(), ']}'))

grantjenkins_4-1676530965092.png

 

XML is a Compose that converts the output from Select Stage 2 to XML. The expression used is:

xml(json(concat('{"root": { items:', body('Select_Stage_2'), '}}')))

grantjenkins_5-1676531029402.png

 

Select Final uses the output from XML, using some XPath to extract out only the items where the value is greater than 0.

//From
xpath(outputs('XML'), '//root/items/values[value > 0]')

//Map
json(item())?['values']

grantjenkins_6-1676531111015.png

 

After running the flow now, we should get the following output.

[
  {
    "unit": "kWh",
    "type": "E",
    "rate": "normal",
    "direction": "consumption",
    "value": "30"
  },
  {
    "unit": "kWh",
    "type": "E",
    "rate": "low",
    "direction": "consumption",
    "value": "31"
  },
  {
    "unit": "kWh",
    "type": "E",
    "rate": "low",
    "direction": "consumption",
    "value": "32"
  }
]

grantjenkins_7-1676531174290.png

 

You can then use the output from Select Final to hopefully get what you're after.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.



Thanks for your response, this indeed seems to work. However, I am missing the date field. Is there any way to get this field in the final stage?

Ahhhhh I completely forgot about the date 😮

 

I'll see what I can do.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

Will there always be exactly two items under each date (normal and low)?


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.

Yes, there are.

Great - should be able to get the date fairly easily then. Give me 24 hours - just off to sleep then crazy busy day at work tomorrow.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
