Referencing common table in files with various names

To set the stage, I am building a flow where users can drop Excel files into a SharePoint folder. The flow references one specific table in each file and, for each row in that table, creates a new item on a SharePoint list with the information from that row.

 

The hangup is referencing a specific table when there is no set file name, and then referencing that table's columns when creating the new list item. Any ideas?

1 ACCEPTED SOLUTION


Here are a couple of examples that will hopefully get you what you're after.

 

I have the following SharePoint List.

[Screenshot: SharePoint list]

 

And I have a folder in my library where the Excel files will be uploaded. It doesn't matter what the file names are, as long as each file has a Table called SalesTable with the appropriate columns. As an example:

[Screenshot: Excel file containing the SalesTable]

 

 

EXAMPLE 1 - Using JSON Schema

 

The full flow is below. I'll go into each of the actions.

[Screenshot: full flow for Example 1]
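
In outline, the actions are:

When a file is created (properties only)
List rows present in a table
Parse JSON
Apply to each
    Create item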

 

The When a file is created (properties only) trigger is set to my Dropoff folder.

[Screenshot: trigger configuration]
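
For reference, the trigger only needs the site, library, and folder; something like the following (the site and library names here are placeholders for your own):

Site Address: https://yourtenant.sharepoint.com/sites/YourSite
Library Name: Documents
Folder: /Dropoff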

 

List rows present in a table uses the Full Path from the trigger and has the Table name hardcoded to SalesTable.

[Screenshot: List rows present in a table configuration]
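
If you'd rather type an expression than pick the Full Path dynamic content, the File field would look roughly like this (a sketch; the exact property name comes from the trigger's outputs):

//File field of List rows present in a table
triggerOutputs()?['body/{FullPath}']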

 

Before moving on, save and run the flow once, then copy the output from List rows present in a table; we'll use it to generate the schema in the next step. Once you've copied the output, go back into edit mode and continue.
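
The copied output is a JSON array of row objects, something along these lines (the values below are made up purely for illustration):

[
    {
        "@odata.etag": "",
        "ItemInternalId": "00000000-0000-0000-0000-000000000000",
        "Title": "Mr",
        "First Name": "John",
        "Last Name": "Smith",
        "Email": "john.smith@contoso.com",
        "Country": "Australia",
        "Company": "Contoso"
    }
]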

 

Parse JSON takes in the value from List rows present in a table. To get the schema, click Generate from sample, paste in the output you copied from the previous step, and click Done.

[Screenshot: Parse JSON configuration]
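
The Content field is the value output of List rows present in a table, which as an expression is roughly the following (assuming the default action name):

//Content field of Parse JSON
body('List_rows_present_in_a_table')?['value']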

 

In this example, the schema would look like the following:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "@@odata.etag": {
                "type": "string"
            },
            "ItemInternalId": {
                "type": "string"
            },
            "Title": {
                "type": "string"
            },
            "First Name": {
                "type": "string"
            },
            "Last Name": {
                "type": "string"
            },
            "Email": {
                "type": "string"
            },
            "Country": {
                "type": "string"
            },
            "Company": {
                "type": "string"
            }
        },
        "required": [
            "@@odata.etag",
            "ItemInternalId",
            "Title",
            "First Name",
            "Last Name",
            "Email",
            "Country",
            "Company"
        ]
    }
}

 

The @@odata.etag and ItemInternalId fields are auto-generated. You can either ignore them or remove them from the schema. Also, if some of your columns could contain empty cells (not all fields filled in), you would need to remove those fields from the "required" section. For this example, I'll remove the auto-generated fields and assume Country is optional (not required). My updated schema would look like:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Title": {
                "type": "string"
            },
            "First Name": {
                "type": "string"
            },
            "Last Name": {
                "type": "string"
            },
            "Email": {
                "type": "string"
            },
            "Country": {
                "type": "string"
            },
            "Company": {
                "type": "string"
            }
        },
        "required": [
            "Title",
            "First Name",
            "Last Name",
            "Email",
            "Company"
        ]
    }
}

 

The Apply to each iterates over each of the rows (using the Body from Parse JSON), and for each one, creates a new item in the list.

[Screenshot: Apply to each with Create item]
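
The input to the Apply to each is the Body output from Parse JSON, which as an expression is simply (again assuming the default action name):

//Input to Apply to each
body('Parse_JSON')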

 

And that's it - you should now see items being added to your list.

 

 

 

EXAMPLE 2 - NOT using JSON Schema

 

As an alternative option, you could bypass the Parse JSON and reference the fields directly. See full flow below:

[Screenshot: full flow for Example 2]

 

For the Apply to each, you would now just pass in the value output from List rows present in a table.
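
As an expression, that input is roughly (assuming the default action name):

//Input to Apply to each
body('List_rows_present_in_a_table')?['value']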

 

And for each of the values you would use the following expressions:

//Example
items('Apply_to_each')?['Excel_Column_Name']

//Actual
items('Apply_to_each')?['Title']
items('Apply_to_each')?['First Name']
items('Apply_to_each')?['Last Name']
items('Apply_to_each')?['Email']
items('Apply_to_each')?['Country']
items('Apply_to_each')?['Company']
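
One small caution with this approach: if a cell such as Country can be blank, you may want to fall back to an empty string rather than pass a null into Create item. A minimal sketch, assuming Country is your optional column:

//Optional column - default to an empty string when blank
coalesce(items('Apply_to_each')?['Country'], '')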

 

This would give you the same result as option 1 and might be a bit easier for your scenario.


----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.


4 REPLIES

A few questions:

  1. Is the trigger, When a file is created (properties only)?
  2. Will the Table name always be the same, or could it be anything?
  3. If not a specific Table name, will each Excel File potentially contain multiple Tables, or just one?
  4. Will the columns in the Table always be the same?


  1. Is the trigger, When a file is created (properties only)? Yes.
  2. Will the Table name always be the same, or could it be anything? The table name will always be the same: "Employees". The Excel file will be a standardized form, but the files are often renamed.
  3. If not a specific Table name, will each Excel File potentially contain multiple Tables, or just one? There will be a few tables on the sheet, but only one will need to be referenced.
  4. Will the columns in the Table always be the same? The columns will always be the same, as will the column names. There will be a varying number of rows in the table depending on the number of personnel being submitted.

Thanks for the reply! This flow could save my team a lot of time.


Thank you so much! Works like a dream!
