
Dynamic Data Source and Power Query syntax

Hi all!
I was recently working on fixing a dynamic data source issue that was blocking auto-refresh in the Power BI service. Below is the original query that was used to call the data from the dynamic source.

[screenshot: original OData.Feed query]

I have since changed the query to use Web.Contents with RelativePath, and it works: I no longer get the dynamic data source error in the Power BI service. Below is the new query.

[screenshot: new Web.Contents query with RelativePath and Headers]
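Roughly, it has the shape below (just a sketch with placeholders; the real URL and path are in the screenshot):

let
    Source = Json.Document( Web.Contents(
        "[site url]",                               // static base URL
        [
            RelativePath = "[dynamic part of the path]",
            Headers = [accept="application/json"]
        ]
    ))
in
    Source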

My question is this: in all the forums and Chris Webb's blog posts I have checked, nobody uses 'Headers' inside Web.Contents.
But if I don't use the headers, I get the error "We found extra characters at the end of JSON line". Only by using the headers do I avoid the error.
So, I wanted to know whether I am doing something wrong, or whether the Headers option could be made unnecessary by some other change?
I am also asking because in the old query using OData.Feed, the data was returned as a Table, which gave all the data columns after expanding just once.
But here the data is returned as Records, and I have to expand 3 times (the 1st Record expands to a column of lists -> that column expands to a column of records -> the 3rd expansion gives all the data columns).
This is why I wanted to know if I am doing anything incorrect.


Thanks in advance. 😁

1 ACCEPTED SOLUTION


Web.Contents with the API is my preferred method of accessing SharePoint data, as you typically get far better performance compared to the out-of-the-box SharePoint connectors and OData.Feed, especially when dealing with libraries/lists with lookups. To answer your questions in order:

Q: ... So, I wanted to know whether I am doing something wrong, or whether the Headers option could be made unnecessary by some other change?
A: You always want to include the header as you have it, to tell the API to return data in JSON format. Otherwise it returns data in XML format by default, which is much more annoying to work with in Power Query. FYI, the reason you get an error when you don't specify the accept header is probably that you still have Json.Document in your next step, trying to parse XML, which results in an error.
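To illustrate the two paths (a sketch using the same placeholders as my full example further down; Xml.Tables is shown only to make the point, not as a recommendation):

let
    // With the accept header, JSON comes back and Json.Document parses it
    JsonSource = Json.Document( Web.Contents(
        "[site url]",
        [
            RelativePath = "_api/web/lists(guid'[list guid]')/items",
            Headers = [accept="application/json"]
        ]
    )),
    // Without the header the default response is Atom/XML, so Json.Document
    // fails with the "extra characters" error; you would have to parse it
    // with something like Xml.Tables instead
    XmlSource = Xml.Tables( Web.Contents(
        "[site url]",
        [ RelativePath = "_api/web/lists(guid'[list guid]')/items" ]
    ))
in
    JsonSource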

Q: But here, data is returned as 'Records'[?]
A: Json.Document parses the data by treating it like one giant nested record, which makes sense if you look at JSON syntax: it is basically a bunch of field/value pairings, where a value can be a single value, an array of values, or an array of additional field/value pairings. A typical SharePoint list or library items API response in JSON looks like:

{
    **one or two query-specific odata field/values**,
    "value": [
        {
            **A few item-specific odata field/values**,
            "Id": 1,
            "Title": "Title1",
            "ID": 1
        },
        {
            **A few item-specific odata field/values**,
            "Id": 2,
            "Title": "Title2",
            "ID": 2
        },
        {
            **A few item-specific odata field/values**,
            "Id": 3,
            "Title": "Title3",
            "ID": 3
        }
    ]
}

So, as you can see above, the main data (the list items) appears as an array of records in the top-level "value" field.
Thus, if your Source is the Web.Contents call wrapped in Json.Document, Source[value] should give you a list of records like the one below:

[screenshot: Source[value] shown as a list of records]
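In M terms, using the sample JSON above as a stand-in for the web response (a self-contained sketch you can paste into a blank query):

let
    // Stand-in for Json.Document(Web.Contents(...)) using the sample above
    Source = Json.Document("{ ""value"": [ { ""Id"": 1, ""Title"": ""Title1"", ""ID"": 1 } ] }"),
    Items = Source[value],           // a list of records, one per list item
    FirstTitle = Items{0}[Title]     // "Title1"
in
    FirstTitle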

Q: ... and I have to expand 3 times[?]
A: I'm not exactly sure what operations you are doing when you "expand 3 times", but in almost any context, the best method in my experience for converting a list of records to a table is Table.FromRecords. Again, if your Source step is the Json.Document/Web.Contents formula and the columns in your $select argument are not yet finalized, your next step should look something like:

= Table.FromRecords( Source[value], null, MissingField.UseNull )

And once you have finalized which columns you are querying, I've found it good practice to define your expected table schema, e.g. if your query's $select is "Id,Title":

= Table.FromRecords( Source[value], type table [Id=nullable number, Title=nullable text], MissingField.UseNull )

And before you ask: yes, you can specify complex fields when setting the schema. E.g. here is an example with Author (Created By) included, where Author/Title, Author/EMail, and Author/JobTitle were specified in the $select:

= Table.FromRecords( Source[value], type table [Id=nullable number, Title=nullable text, Author=nullable [Title=text,EMail=text,JobTitle=text] ], MissingField.UseNull )

In practice, I actually split out my complex field definitions, defining them before the full table definition, and then put it all together in the Table.FromRecords parsing. Here is a full Advanced Editor example in M; note the extra work needed for multiselects:

let
    // Static site URL as the first argument; all dynamic parts go into
    // RelativePath/Query so the Power BI service can refresh the source
    Source = Json.Document( Web.Contents(
        "[site url]",
        [
            RelativePath = "_api/web/lists(guid'[list guid]')/items",
            Headers = [accept="application/json"],
            Query = [
                #"$select"="Id,Title,Created,Author/Title,Author/EMail,Author/JobTitle,MultiSelectLookup/Title",
                #"$expand"="Author,MultiSelectLookup",
                #"$top"="5000"
            ]
        ]
    )),
    // Complex field schemas defined separately, then composed below
    AuthorSchema = type nullable [Title=text,EMail=text,JobTitle=text],
    MultiSelectLookupSchema = type table [Title=nullable text],
    TableSchema = type table [
        Id=nullable Int64.Type, Title=nullable text,
        Created=nullable text, Author=AuthorSchema,
        MultiSelectLookup=nullable list
    ],
    ParseData = Table.FromRecords( Source[value], TableSchema, MissingField.UseNull ),
    // Multiselects arrive as lists of records; convert each to a typed table
    ParseTableCols = Table.TransformColumns(
        ParseData,
        {{
            "MultiSelectLookup",
            each Table.FromRecords(_, MultiSelectLookupSchema, MissingField.UseNull),
            MultiSelectLookupSchema
        }}
    ),
    // Dates come over the wire as text, so convert them explicitly
    TransformTypes = Table.TransformColumnTypes(ParseTableCols,{{"Created", type datetime}})
in
    TransformTypes

Additional notes:

  1. As you can see from my example above, I prefer using GUIDs rather than titles, since titles can change. You can extract the GUID from the URL of the list settings page in SharePoint, or query _api/web/lists to get all the lists and their attributes, including the GUID (the Id field) - see the first sketch after this list.
  2. It doesn't apply to some types of queries through the SharePoint API (e.g. querying a doc library folder), but when dealing with lists you'll definitely need to factor paging into your query. The API only returns 100 items by default, with a maximum of 5000 items per web call if you specify $top in the query. That means if you need to return more than 5000 items, you'll need to make multiple calls - a paging sketch follows this list.
  3. Re: setting the schema, note that simple values come in as text, number, or boolean (and pretty sure nothing else). E.g. dates come in as text in the 'YYYY-MM-DDThh:mm:ssZ' format. It's still useful IMO to set the schema because a) it saves you the trouble of removing the extra ID column that always comes in, b) it puts columns in the order you specify, and c) it lets you set types up front and use Table.TransformColumnTypes only on columns that actually require transformation (okay, the last point only applies to the pedantic among us).
  4. Multiselect fields usually show up as lists of records, which, as discussed above, Table.FromRecords is most effective at parsing. You'll initialize these as a list column, and can then take an additional step to transform them into correctly typed table columns.
  5. In a lot of cases you can actually forgo all the nullable / MissingField.UseNull handling, but I've found that accounting for missing fields makes the query more durable over time.
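
To make notes 1 and 2 concrete, here are two hedged sketches. First, pulling all lists with their GUIDs via _api/web/lists (the $select of Id,Title is my assumption; you can pull any list attributes):

let
    Source = Json.Document( Web.Contents(
        "[site url]",
        [
            RelativePath = "_api/web/lists",
            Headers = [accept="application/json"],
            Query = [ #"$select" = "Id,Title" ]
        ]
    )),
    // One row per list; the Id column holds each list's GUID
    Lists = Table.FromRecords( Source[value], null, MissingField.UseNull )
in
    Lists

And one possible paging approach for note 2, paging on Id with $filter/$orderby (my assumption; following the response's next link is another option):

let
    // Fetch one page of up to 5000 items with Id greater than lastId
    GetPage = (lastId as number) =>
        Json.Document( Web.Contents(
            "[site url]",
            [
                RelativePath = "_api/web/lists(guid'[list guid]')/items",
                Headers = [accept="application/json"],
                Query = [
                    #"$select" = "Id,Title",
                    #"$orderby" = "Id",
                    #"$filter" = "Id gt " & Number.ToText(lastId),
                    #"$top" = "5000"
                ]
            ]
        ))[value],
    // Keep requesting pages until a call comes back empty
    Pages = List.Generate(
        () => GetPage(0),
        each List.Count(_) > 0,
        each GetPage( List.Last(_)[Id] )
    ),
    AllItems = List.Combine( Pages ),
    Result = Table.FromRecords( AllItems, null, MissingField.UseNull )
in
    Result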

Edit: minor grammar fixes, trying to fix whitespace, removed application/json;odata=nometadata code snippet I decided not to comment on


12 REPLIES

It may be that the site expects that header value, and returns something different (such as a non-JSON error message) if you don't provide it. To see what the site is returning, you can wrap the call to Web.Contents in Text.FromBinary instead of Json.Document. This will return the raw text, which should help you determine what's going awry.
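
For example (a sketch; substitute your actual site URL and options):

let
    // Same call as in your query, but returning the raw response text
    RawText = Text.FromBinary( Web.Contents(
        "[site url]",
        [ RelativePath = "_api/web/lists(guid'[list guid]')/items" ]
    ))
in
    RawText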

I see. Let me check this and will let you know.
Also, since I am a complete newbie in Power BI and Power Query, I wanted to confirm that the three expansions of the records are completely normal in this case then, right? Since OData.Feed's table output needed only one expansion?

Thanks for the reply.

Hi! I tried wrapping Web.Contents in Text.FromBinary.
For every column of data, I am getting this as the output:

[screenshot: raw response text returned by Text.FromBinary]

Does this show what type of output the site is giving?
Thanks in advance!

That looks like XML/OData, not JSON. Any reason why you switched your original query from OData.Feed to Json.Document?

The reason for the switch was to enable scheduled refresh in the Power BI service.
Using OData.Feed was causing the dynamic data source error, and the only fix I found in various blogs was to use Web.Contents and keep the dynamic part in RelativePath.
But I could not find anything for handling the XML/OData content that Web.Contents returns, which is why I had to use Json.Document.
If there is a way to handle XML/OData with Web.Contents, let me know; maybe there are other methods that I could not find through a Google search.

Thanks!

Thank you very much for this!
It really helped in clarifying the fundamentals.
About the three expansions, I am not sure either. I did previously try Table.FromRecords, but it gave me a conversion error (I don't remember exactly what now). I did not look into it too much, since I was getting my final output of all the data correctly after all the expansions, so I did not bother. 😅


Table.FromRecords with just the first argument will usually work, but I've found that with some SharePoint sources (I think it usually comes down to whether even a single item in the query somehow had a field blown away), it only works if you specify MissingField.UseNull. So, unless I'm being lazy/fast, I always include null (the validator yells at you if you try to leave the argument empty; that's okay in DAX, not in M) and MissingField.UseNull as the 2nd and 3rd arguments.
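
A small self-contained illustration of the difference (the records here are made up):

let
    // The second record is missing the Title field
    Records = {
        [ Id = 1, Title = "Title1" ],
        [ Id = 2 ]
    },
    // With the default (MissingField.Error) this call would error out;
    // with MissingField.UseNull the missing Title simply becomes null
    Result = Table.FromRecords(
        Records,
        type table [ Id = nullable number, Title = nullable text ],
        MissingField.UseNull
    )
in
    Result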

Hi @SDM_1997,

"I did not look too much into it as I was getting my final output of all data correctly after all the expansions so did not bother too much"

So it seems that your issue has been solved, right? If so, please kindly accept a helpful reply posted above as the solution so the thread can be closed. More people will benefit from it.

Best Regards,
Eyelyn Qin
