
How to remove duplicate rows from a SharePoint list... easily or simply?

Hi all, 

 

I'd like to create a Flow that removes rows where a single column entry (a unique ID) is duplicated.

In this case it's an Incident number, for example INC123456.

How can this be done?

 

I've googled until all the hyperlinks have turned purple. This seems to be the closest I have come to some success:

 

https://tomriha.com/how-to-delete-duplicate-items-in-a-sharepoint-list-with-power-automate/#:~:text=....

 

Still, this doesn't work for me; not a single item is being deleted from my list.

 

Could someone help with a foolproof way of automating this? I don't want to concatenate any of the columns or anything like that. You would think something like this should be simple? Why would we want to complicate things further like this?

 

https://www.youtube.com/watch?v=SGXAqAzYUSM


21 REPLIES
takolota
Multi Super User

@SuperDude123 

 

Here is a template that finds & removes the smallest/oldest or largest/most recent duplicates from a list that contains up to 100,000 items.

 

https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Find-and-Remove-Duplicates/td-p/2191403

 

Note: Version 1.2 is much faster.
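For readers who just want the gist of that template, here is a minimal Python sketch of the same idea (an illustration only, not the flow itself; the Number column is just an example): sort the list oldest-first, keep the first occurrence of each key, and collect the IDs of every later duplicate for deletion.

```python
def find_duplicate_ids(items, key):
    """Return the IDs of items whose `key` value already appeared earlier.

    `items` is assumed sorted oldest-first (like the template's
    'Created asc' ordering), so the oldest copy of each value is kept.
    """
    seen = set()
    duplicate_ids = []
    for item in items:
        value = item[key]
        if value in seen:
            duplicate_ids.append(item["ID"])  # a later copy: delete it
        else:
            seen.add(value)                   # first copy: keep it
    return duplicate_ids

rows = [
    {"ID": 1, "Number": "INC123456"},
    {"ID": 2, "Number": "INC123457"},
    {"ID": 3, "Number": "INC123456"},  # repeats ID 1's incident number
]
print(find_duplicate_ids(rows, "Number"))  # [3]
```

Sorting descending instead would flip which copy survives, which is what the template's asc/desc note is about.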


Hello takolota, and thanks for replying. I have already tried this template.

At the step "The middle actions will process the records, identify the duplicate records, & return a JSON array of only duplicate records." I am unable to follow the instructions. How do I create the JSON at this stage? It's not part of any defined variables. Are you able to clarify?

Thanks

Please see the snip here. For example, I understand that I need to type the expression at the bottom of the snip. How do I create the top two expressions? [Screenshot 2023-09-26 080201.jpg]

It’s

item()?['RecordJSON']

and

removeProperty(item(), 'RecordJSON')

 

But you could also have just downloaded the zip file at the end of the post & imported that into Power Automate.

Thank you Takolota. I will look at this template again.

I had already tried to download and upload the template as you suggested.

I tried this a few times over the past couple of days, and once uploaded I couldn't find the template I had uploaded.

I tried to search cloud templates, for example, but it just wasn't obvious to me where the template had gone once it had been uploaded.

Many thanks again for your response, I will try to complete the template again using your comments above.

Hello again!

Please see the attached screenshot. 

Body is empty and it doesn't look like Delete item is being triggered.

I assumed that isDuplicate (in Filter array Get duplicate records) should be item()?['isDuplicate'] (correct?)

This is pretty much the same problem I had when I tried to follow the below link. Delete is not being triggered.

https://tomriha.com/how-to-delete-duplicate-items-in-a-sharepoint-list-with-power-automate/#:~:text=....

Should Body be empty?

Thanks in advance!

@SuperDude123 

 

I used IsDuplicate with a capital I. Does your creation expression match the reference in your picture? These are case-sensitive.
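To illustrate the case-sensitivity point (a trivial Python sketch, not Power Automate itself): field references behave like case-sensitive dictionary lookups, and the `?[]` operator returns null rather than raising an error, so a wrongly-cased name fails silently.

```python
# A record as produced by the duplicate-marking step (illustrative shape)
record = {"IsDuplicate": 1}

# item()?['isDuplicate'] quietly yields null, like dict.get with a bad key
print(record.get("isDuplicate"))  # None

# item()?['IsDuplicate'] with the matching case finds the value
print(record.get("IsDuplicate"))  # 1
```

Because the lookup never errors, the Filter array action simply matches nothing, which is consistent with "not a single item is being deleted".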

Thanks, I have changed to upper case capital I.

Still not working.sadly.

Do I need to map additional properties? I am only mapping one property, which is the incident number.

BTW, the screenshot shows Apply to each as nested twice. I have confirmed this is not the case; please ignore the previous screenshot.

Select DuplicateCheckFields screenshot

takolota
Multi Super User

Here, you can press Ctrl + C to copy this to your clipboard, then go to create a new action in your flow, go to the My clipboard tab, press Ctrl + V to paste it into the clipboard tab, and select it.

 

{"id":"5733a2aa-bb41-4e73-a84f-c281e59db276","brandColor":"#8C3900","connectionReferences":{"shared_sharepointonline":{"connection":{"id":"/providers/Microsoft.PowerApps/apis/shared_sharepointonline/connections/shared-sharepointonl-60d1f27e-bd8b-43f2-a8ad-a7be60200f79"}}},"connectorDisplayName":"Control","icon":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMzIiIGhlaWdodD0iMzIiIHZlcnNpb249IjEuMSIgdmlld0JveD0iMCAwIDMyIDMyIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPg0KIDxwYXRoIGQ9Im0wIDBoMzJ2MzJoLTMyeiIgZmlsbD0iIzhDMzkwMCIvPg0KIDxwYXRoIGQ9Im04IDEwaDE2djEyaC0xNnptMTUgMTF2LTEwaC0xNHYxMHptLTItOHY2aC0xMHYtNnptLTEgNXYtNGgtOHY0eiIgZmlsbD0iI2ZmZiIvPg0KPC9zdmc+DQo=","isTrigger":false,"operationName":"Find_And_Remove_Duplicates","operationDefinition":{"type":"Scope","actions":{"Get_items":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_sharepointonline","operationId":"GetItems","apiId":"/providers/Microsoft.PowerApps/apis/shared_sharepointonline"},"parameters":{"dataset":"@outputs('Placeholder_Please_Delete_After_Import')","table":"@outputs('Placeholder_Please_Delete_After_Import')","$orderby":"Created asc"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{},"description":"Pagination turned on & set to 100000 to get all records up to 100000. 
Order By ascending (asc) will keep the oldest/smallest & delete the newest/largest, Order By descending (desc) will keep the newest/largest & delete the oldest/smallest.","runtimeConfiguration":{"paginationPolicy":{"minimumItemCount":100000}},"metadata":{"operationMetadataId":"8c675d9e-b2a8-46f6-9864-7f46f72bc7e6"}},"Find_duplicates":{"type":"Scope","actions":{"Select_DuplicateCheckFields":{"type":"Select","inputs":{"from":"@outputs('Get_items')?['body/value']","select":{"RecordJSON":"@AddProperty(item(), 'MakeUniqueGUID', guid())","Title":"@item()?['Title']","Column2":"@item()?['Column2']"}},"runAfter":{},"description":"Do not remove the RecordJSON line in the Map table. Fill the column name & dynamic content value for each data source column you want to include in the check for duplicates. Only records where all listed columns match will count as duplicates.","metadata":{"operationMetadataId":"218007cf-3241-4ebf-906c-563b82e36850"}},"Filter_array_Get_duplicate_records":{"type":"Query","inputs":{"from":"@body('Select_ResetToRecordWithIsDuplicateField')","where":"@equals(item()?['IsDuplicate'], 1)"},"runAfter":{"Select_ResetToRecordWithIsDuplicateField":["Succeeded"]},"description":"Filter to only duplicates. If the original get data action sorts ascending it keeps the oldest/smallest & deletes the newest/largest. If the original get data action sorts descending it keeps the newest/largest & deletes the oldest/smallest.","metadata":{"operationMetadataId":"04080a21-d0d1-4018-9d1b-cb6411877800"}},"Select_ReformatRecordAndDuplicateChecks":{"type":"Select","inputs":{"from":"@reverse(body('Select_DuplicateCheckFields'))","select":{"RecordJSON":"@item()?['RecordJSON']","DuplicateFieldsJSON":"@removeProperty(item(), 'RecordJSON')"}},"runAfter":{"Select_DuplicateCheckFields":["Succeeded"]},"description":"Separates each item into the RecordJSON object & an object containing all the columns used in the duplicates check. 
Reverse ordering to make user order input for keeping records more intuitive.","metadata":{"operationMetadataId":"323c8da9-040a-47db-8d13-07d3bd68c1d7"}},"Select_ResetToRecordWithIsDuplicateField":{"type":"Select","inputs":{"from":"@body('Select_ReformatRecordAndDuplicateChecks')","select":"@addProperty(item()?['RecordJSON'], 'IsDuplicate', if(greater(length(split(join(skip(split(string(body('Select_ReformatRecordAndDuplicateChecks')), string(item())), 1), ''), string(item()?['DuplicateFieldsJSON']))), 1), 1, 0))"},"runAfter":{"Select_ReformatRecordAndDuplicateChecks":["Succeeded"]},"description":"For each, takes the original record data from the RecordJSON object & adds an IsDuplicate field, calculating if the DuplicateFieldsJSON columns object repeats in the remainder of the dataset - marking each item as a duplicate 1 or not duplicate 0.","metadata":{"operationMetadataId":"e8d367b9-f6a9-472f-899b-716cfa5b650b"}}},"runAfter":{"Get_items":["Succeeded"]},"description":"Take in a JSON array & columns to check for duplicates. 
Output a JSON array of only the duplicate records found for those columns.","metadata":{"operationMetadataId":"0c3ee8e3-6aec-4537-988e-f8849d28bca6"}},"Apply_to_each":{"type":"Foreach","foreach":"@body('Filter_array_Get_duplicate_records')","actions":{"Delete_item":{"type":"OpenApiConnection","inputs":{"host":{"connectionName":"shared_sharepointonline","operationId":"DeleteItem","apiId":"/providers/Microsoft.PowerApps/apis/shared_sharepointonline"},"parameters":{"dataset":"@outputs('Placeholder_Please_Delete_After_Import')","table":"@outputs('Placeholder_Please_Delete_After_Import')","id":"@items('Apply_to_each')?['ID']"},"authentication":{"type":"Raw","value":"@json(decodeBase64(triggerOutputs().headers['X-MS-APIM-Tokens']))['$ConnectionKey']"}},"runAfter":{},"description":"items('Apply_to_each')?['ID']","metadata":{"operationMetadataId":"6f676b71-f7e9-447c-ae3f-e5544bdb9bf7"}}},"runAfter":{"Find_duplicates":["Succeeded"]},"description":"Use items('Apply_to_each')?['InsertColumnNameHere'] to reference fields within the apply to each loop.","metadata":{"operationMetadataId":"7aaa57db-381b-4442-8ee0-956a4b5aea5f"}}},"runAfter":{"Placeholder_Please_Delete_After_Import":["Succeeded"]}}}

 

 

After import, this Select action's expression will likely be messed up. Remove/erase the expression, select the Map toggle on the middle right of the action twice to clear the disabled grey shade and bring back the editable single-box input, then enter the following expression again to fix it:

ErrorSelect.png

 

addProperty(item()?['RecordJSON'], 'IsDuplicate', if(greater(length(split(join(skip(split(string(body('Select_ReformatRecordAndDuplicateChecks')), string(item())), 1), ''), string(item()?['DuplicateFieldsJSON']))), 1), 1, 0))
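For anyone curious what that expression actually does, here is a rough Python analogue (purely illustrative; Number is just an example check column). The serialized dataset is split on the current record (made unique by a GUID), everything after it is rejoined, and the record is flagged as a duplicate if its check-fields object appears again in that remainder. The array is reversed first so that, with an ascending sort in Get items, the oldest copy is the one kept.

```python
import json
import uuid

def mark_duplicates(items, check_fields):
    # Select_DuplicateCheckFields: a unique GUID makes every record's
    # serialized form distinct; check_fields are the duplicate-check columns
    records = [
        {
            "RecordJSON": {**item, "MakeUniqueGUID": str(uuid.uuid4())},
            "DuplicateFieldsJSON": {f: item.get(f) for f in check_fields},
        }
        for item in items
    ]
    records.reverse()  # reverse(...): so the first-sorted copy is kept
    dataset = json.dumps(records)
    flagged = []
    for rec in records:
        # split on the whole record, then skip(..., 1) + join('') keeps
        # only the dataset text that comes after this record
        remainder = "".join(dataset.split(json.dumps(rec))[1:])
        # greater(length(split(remainder, fields)), 1): do the
        # duplicate-check fields repeat later in the dataset?
        is_dup = int(json.dumps(rec["DuplicateFieldsJSON"]) in remainder)
        flagged.append({**rec["RecordJSON"], "IsDuplicate": is_dup})
    return flagged

items = [
    {"ID": 1, "Number": "INC123456"},
    {"ID": 2, "Number": "INC123457"},
    {"ID": 3, "Number": "INC123456"},  # repeats ID 1's incident number
]
dupes = [r["ID"] for r in mark_duplicates(items, ["Number"]) if r["IsDuplicate"]]
print(dupes)  # [3]
```

With the items sorted oldest-first, only the later copy (ID 3) gets flagged for the Delete item loop.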

 



Note: Version 1.2 here is much faster:

https://powerusers.microsoft.com/t5/Power-Automate-Cookbook/Find-and-Remove-Duplicates/td-p/2191403

I really appreciate your time on this.

So I followed your guide to the best of my understanding. I had an error on pagination, so I switched pagination off.

I then changed DuplicateCheckFields to just look for 'Number' which is the incident number.

I don't want to take up too much of your time on this. Do you have any idea why it's failing as it is? What is the Flow failing to do that it should be doing?

Field_0 is Number, and this is the INC number I am trying to remove duplicates of.

Another screenshot showing "Enter Valid JSON": [Screenshot5.png]

It's time for me to clock off for the day and go home, but thank you for your help, takolota. I do appreciate your efforts today. Full marks for effort!

I will try again tomorrow; maybe I can find a simpler solution to my very simple requirement.
I'm starting to think PowerShell would be much easier!

 

@SuperDude123 

 

You need to click into the map input, wait for the pop-up, switch to the expression tab in the pop-up, then insert the expression there.

Fresh start this morning. I had to kick myself for making a simple mistake like that. Thanks for pointing that out. Must have been getting tired!
So it's saved now; as you requested, I have used your imported template. I have saved and tested.

Still I have the same problem. Nothing is being deleted. Outputs is empty. Delete item doesn't appear to have run.

 

SuperDude123_0-1695800887388.png

 

OK, so now it works (almost).

Firstly, Thanks Takolota for all your patience, help and support.

Please bear with me while I explain my experience so far.

 

I had created a script to delete duplicates previously, before starting this thread.

I am getting the exact same issue with my previous scripts as I am with this one.

 

The problem is that yes, the script does work if I search for duplicate TITLE, or in this case another field called PRIORITY. The script/Flow works 100%, so thank you!

However, no duplicates are deleted for the NUMBER field, which is the Incident Number, i.e. INC123456.

 

Do you have any idea why this could be?

 

I think I have wasted quite a bit of my own time on this, and also some of yours. It really wasn't clear to me in testing whether the script was working or not, because I neglected to try deleting duplicates of other fields/columns.

So I finally have this working. I had to change item()?['field0'] to item()?['Number'], and that looks to have cracked it.

When I selected Number from the Dynamic Content, it populated the Map field with item()?['field0'].
When I selected Title (the Flow worked with Title but not with Number), the Dynamic Content was populated with item()?['Title'].

Unsure why this is the case, but changing the field to item()?['Number'] (by typing it in an Expression) resolved the problem.
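A note on why this bites: the Get items output keys each value by the column's internal name, which is not always the display name (particularly for columns renamed after creation), and in this case the designer's dynamic content suggested a key that didn't match the data. Because the safe `?[]` lookup returns null instead of erroring, a mismatched key silently feeds nulls into the duplicate check. A trivial sketch (column names are illustrative):

```python
# A Get items record as it might actually be keyed (illustrative shape)
record = {"ID": 42, "Title": "Server outage", "Number": "INC123456"}

# If the designer inserts item()?['field0'] but the data actually uses
# 'Number', the safe lookup yields null with no error raised
print(record.get("field0"))  # None

# Typing the matching key into an Expression fixes it
print(record.get("Number"))  # INC123456
```

Checking the raw Get items output body in a test run is a quick way to see which key names the data really uses.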

I hope this helps other users struggling with this problem.

 

Very pleased with the support given here. 

 

All the best, takolota. Good work!

 

SuperDude123_0-1695805186251.png

 
