Remove duplicates by prioritizing rows based on another column

Hi, everyone.

 

I am having the following problem and don't seem to find any solution to implement this in Power Query.

I have a dataset of loading lists of orders. It looks like this:

 

[Screenshot: SergeyShelest_1-1630059451503.png (sample loading-list data)]

 

For a given loading list (Ladeliste), I want to remove duplicate NVE numbers (shipping unit numbers), but only those that have empty values in the LHM number column (loading unit ID). If the corresponding cell in the LHM-Nr. column is not empty, I want to keep the row. For example, when removing the duplicate in rows 2 and 6, I want to remove row 2, since its LHM-Nr. cell is empty. Correspondingly, between rows 3 and 7, I want to remove row 3.

 

This is how the resulting table should look:

[Screenshot: SergeyShelest_2-1630059719434.png (expected result)]

Removing duplicates in Power Query via Remove Rows > Remove Duplicates keeps the first occurrence of each value. Sorting by loading list and then removing duplicates might work, but I am not sure it wouldn't also delete rows that have values in the LHM-Nr. column.

 

Does anyone have any idea how to implement this?

1 ACCEPTED SOLUTION


Okay, please try this.  I have included comments that explain each step.

 

BEFORE: 

The goal is to remove rows 1, 3, 6, 8 and to have all columns present in the result.

[Screenshot: jennratten_0-1630267603190.png (before)]

 

RESULT:

[Screenshot: jennratten_1-1630267728521.png (result)]

 

SCRIPT:

There are two options in the grouped step, along with comments on when each one applies, depending on the specifics of the source data. Currently, option 2 is in use. To switch to option 1, add two forward slashes in front of Table.LastN and remove the two slashes at the beginning of the let varTable and Table.FirstN lines.

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jcy5DcAgEETRXjZGaHbBVwi+irDovw0vJiFAmOQjxGOeh0IIZIgB6BHiroW3YCsQ1kt+pGT+ISNXKiu9UTcEGaL1n40x5n/F7sfZGJ0q6HtwHoIMp10qOxV7XndjdB2CDK/dKKUX", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Latest = _t, #"NVE-Nr." = _t, Item = _t, Date = _t, #"LHM-Nr." = _t, Index = _t]),
    grouped = Table.Group(
        Source, 
        // Column(s) containing the values from which you'd like duplicates removed.  
        {"NVE-Nr."},
        {
            { "Table", // Name of the new column.
                each 
                //---------------------------------------------------------
                // Option 1: Sort descending, then select the first result.
                //---------------------------------------------------------
                // This will work if there is a maximum of only two rows per NVE-Nr.
                //let varTable = Table.Sort ( _, {{"LHM-Nr.", Order.Descending}}) in
                //Table.FirstN ( varTable, 1 ),
                //---------------------------------------------------------
                // Option 2: Select the last LHM-Nr. for each NVE-Nr.
                //---------------------------------------------------------
                // This will work if the row to keep always appears last in the group.
                Table.LastN ( _, 1 ),                
                type table
            }
        }
    ),
    expand = Table.ExpandTableColumn ( 
        grouped, 
        "Table",                                  // Expand the tables in this column
        List.Difference (                         // New column names
            Table.ColumnNames (                   // are the column names 
                Table.Combine ( grouped[Table] )  // in the nested tables 
            ), 
            Table.ColumnNames ( grouped )         // that do not appear in the grouped table.
        ) 
    )

in
    expand

 

 

 



How to get good help fast. Help us help you.

How To Ask A Technical Question If you Really Want An Answer

How to Get Your Question Answered Quickly: give us a good and concise explanation.
How to provide sample data in the Power BI Forum: provide data in a table format per the link, or share an Excel/CSV file via OneDrive, Dropbox, etc. Provide expected output using a screenshot of Excel or another image. Do not provide a screenshot of the source data; I cannot paste an image into Power BI tables.

Hi, edhans. 

 

Here is some sample data:

 

https://docs.google.com/spreadsheets/d/12v9392oeTLGUqKC98SZLt-cZVHAh4dhA/edit?usp=sharing&ouid=10760...

 

Please let me know if you need any clarifications.

 

Regards,

I might have a simpler solution. From this data set, I can see that there are duplicates in NVE-Nr. only when LHM-Nr. is empty. In theory, you could just remove all rows where LHM-Nr. is empty, and that would consequently remove the duplicates from NVE-Nr. To do this, click on the LHM-Nr. column, then at the top of Power Query click Remove Rows, then Remove Blank Rows. This should work.
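For reference, a minimal M sketch of that filter, assuming the previous step is named Source and using the LHM-Nr. column name from the screenshots (the code the UI generates may differ slightly):

```
// Keep only rows where LHM-Nr. has a value; this removes the blank
// duplicates, but also removes any row whose LHM-Nr. is genuinely missing.
Filtered = Table.SelectRows(Source, each [#"LHM-Nr."] <> null and [#"LHM-Nr."] <> "")
```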

 

Otherwise, you could just do what you said and sort by loading list. Since the rows move as a whole, you wouldn't mix up any information, and you could then remove duplicates, since the duplicate row without empty values would be on top. Hope this helps.
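That sort-then-deduplicate idea can be sketched in M. This is a sketch assuming the previous step is named Source; note that Table.Distinct keeps the first row it encounters, and Table.Buffer is needed to pin the sorted order in memory so it is actually respected when duplicates are removed:

```
let
    // Sort so that, within each NVE-Nr., a row with a non-empty LHM-Nr. comes first
    Sorted = Table.Sort(Source, {{"NVE-Nr.", Order.Ascending}, {"LHM-Nr.", Order.Descending}}),
    // Without Table.Buffer, Table.Distinct is not guaranteed to honor the sort
    // when choosing which duplicate row survives
    Buffered = Table.Buffer(Sorted),
    // Keep the first row per NVE-Nr. (the one with a value in LHM-Nr., if any)
    Deduped = Table.Distinct(Buffered, {"NVE-Nr."})
in
    Deduped
```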

Hi, @JoaoPereira .

 

Thanks for the tip. Unfortunately, this would lead to data loss, since I have over 12,000 rows of data. Because I imported the data from PDF and then cleaned it, I have some loading lists where Power Query didn't catch the LHM numbers but only the NVE numbers. If I delete all rows with empty LHM numbers, it will delete that information as well, which I would like to avoid.

 

That's why I need a more delicate solution, one that doesn't lead to any information loss.

 

Regards,

Alright, the solution I came up with is a bit more complicated than it probably needed to be, but it's all I've got. Here is what I recommend:


Duplicate the LHM column. In the Transform section, split the column by Number of Characters (position 26) and then by a specific character (.). This will leave you with two columns: one with a 4-digit number (3422, for example) and a second with the text pdf from the separator. Delete the pdf column.

 

Now, duplicate the NVE column. 

 

Now, merge the duplicated LHM column containing the 4 digits with the duplicated NVE column. Since every NVE is unique, the only duplicate values in this new column will occur when the 4 digits from the LHM are identical AND the NVE column was blank, therefore not adding anything to the values.

 

Finally, select the merged column built from the two duplicates and remove duplicates on it. From there, you can delete the merged column, since the remaining steps work without it. There you have it: a data set where duplicate NVE values remain only when LHM-Nr. is not empty.
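The merge-and-deduplicate mechanics above can be sketched in M; the helper column names ("NVE-Nr. - Copy", "LHM digits") are hypothetical placeholders for the duplicated columns created in the earlier steps:

```
// Combine the two helper columns into a single key column...
Merged = Table.CombineColumns(
    Source,
    {"NVE-Nr. - Copy", "LHM digits"},   // hypothetical helper column names
    Combiner.CombineTextByDelimiter("", QuoteStyle.None),
    "Key"),
// ...remove duplicates on the key, then drop it, since the result no longer needs it.
Deduped = Table.Distinct(Merged, {"Key"}),
Cleaned = Table.RemoveColumns(Deduped, {"Key"})
```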

Hello - here is an option for returning the result.

 

I have created a sample table with similar specs.  The goal is to keep rows 2, 3, 5, 7, 8.

 

BEFORE

[Screenshot: jennratten_0-1630088951723.png (before)]

 

AFTER

[Screenshot: jennratten_1-1630089014823.png (after)]

 

SCRIPT

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUXJ0cgaSSrE6yFzfxJKMYrCYEZDn5OwCJF3z0nMyizPQRMFcYyDD2cUVRSNCDMw1ATJcXN1gXFOQgW7uyMbGAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Student ID" = _t, #"Student Name" = _t, Subject = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Student ID", Int64.Type}, {"Student Name", type text},  {"Subject", type text}}),
    #"Grouped Rows" = Table.Group(
        #"Changed Type", 
        {"Student ID", "Student Name"}, 
        {
            { "Subject", each List.First ( List.Sort ( _[Subject], Order.Descending ) ) },
            { "All", each _, type table [Student ID=nullable number, Student Name=nullable text, Subject=nullable text]}}
    )
in
    #"Grouped Rows"

 

 

Hi, @jennratten .

 

Thank you for your solution.

 

I am not that advanced a user in Power Query, which is why I don't fully understand what the code does. Could you please explain it so that I can adjust it to my case? Why do I actually need the index column of Student ID, and what does it do? Do I also need to create an index column in my case?

 

Regards  

Hi - no problem at all.  You don't need an index column.  That just happened to be the column in my sample data that I wanted to use as the column from which duplicates should be removed.  See this new script below.  I have added comments and examples of what you need to change to make it work for you.  Please let me know if you have any other questions.

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUXJ0cgaSSrE6yFzfxJKMYrCYEZDn5OwCJF3z0nMyizPQRMFcYyDD2cUVRSNCDMw1ATJcXN1gXFOQgW7uyMbGAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Student ID" = _t, #"Student Name" = _t, Subject = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Student ID", Int64.Type}, {"Student Name", type text},  {"Subject", type text}}),
    #"Grouped Rows" = Table.Group(
        #"Changed Type", 
        // Column(s) containing the values from which you'd like duplicates removed.  
        // Replace this with the NVE numbers column. {"NVE-Nr."}
        {"Student ID", "Student Name"}, 
        {
            // Column that contains blank/empty/null values that should be removed if 
            // another row exists that is not blank/empty/null.
            // "Subject" will be the name of the new column.
            // _[Subject] is the column in the current table whose values should be evaluated.
            // Your column name in the current table will need to be referenced a little differently
            // since it contains a hyphen.
            // { "LHM-Nr.", each List.First ( List.Sort ( _[#"LHM-Nr."], Order.Descending ) ) },            
            { "Subject", each List.First ( List.Sort ( _[Subject], Order.Descending ) ) },
            { "All", each _, type table }}
    )
in
    #"Grouped Rows"

Hey,

 

thanks again for your help.

 

It has worked, but my goal is to remove the duplicates and also retain all the information in the other columns. Right now I only have the two columns of NVE and LHM numbers, plus grouped tables that contain the duplicate values and all the information I need. When I expand the tables, I get all the duplicates again; it seems like I would have to go over it once more. Is there a way to remove the duplicates (as you already showed) and retain all the other columns next to the deduplicated data, so that I end up with a clean table containing all the information I need?

[Screenshot: SergeyShelest_0-1630179176344.png (expanded result with duplicates)]

 

Regards
