Hello!
I am wondering how to format my filter array step to get the following:
I have two arrays containing the files from two different libraries that are supposed to be "synced" (the same files in both libraries), where one library is considered the "source" and the other just holds copies.
I don't want to loop over every item just to check whether it's already synced, which is why I'm trying to do it in a Filter array step.
Basically, I want to filter out any files that have the same URL and whose modified date in the source is earlier than in the "mirrored library" (meaning the mirrored library already has the correct version).
Any help is very much appreciated. 😃
Maybe you can guide me in the right direction? @Expiscornovus
I've got an idea of how to do this within a single Filter array, but I'm just off to sleep now, so I won't be able to get you something until later (12:30 AM for me at the moment). If someone else can provide a solution before then, even better 🙂
Are they both Document Libraries that would have identical folder structures/files?
Assuming this would be a scheduled flow that ran daily/weekly?
Hi @StretchFredrik,
Normally I would say, have a look at the Except method described in this blog:
https://pnp.github.io/blog/post/comparing-two-arrays-without-an-apply-to-each/
However, you want to check two things (Url and Modified date time).
The only workaround I can think of at the moment is using the max modified date as the latest sync time. It's not a great workaround, because it doesn't compare against the modified date of the target item itself, only against the max modified date of the whole collection of items.
But because it is a sync process, that might be ok? 😁
@and(contains(body('Select_-_Target_Paths'), item()['File']), greater(ticks(item()['Modified']), max(body('Select_-_Target_Modified'))))
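As a rough illustration (plain Python, not Power Automate; the item shape, function name, and sample data are made up), the workaround above keeps a source file only if it exists in the target paths and its modified time is later than the latest modified time anywhere in the target set:

```python
# Hypothetical sketch of the max-modified-date workaround described above.
from datetime import datetime

def needs_sync(source_item, target_paths, target_modified_dates):
    """Approximate filter: compares against the max target date, not the
    matching target item itself (the limitation noted above)."""
    latest_sync = max(target_modified_dates)
    return (source_item["File"] in target_paths
            and source_item["Modified"] > latest_sync)

source = {"File": "a.docx", "Modified": datetime(2023, 2, 20)}
paths = {"a.docx", "b.docx"}
dates = [datetime(2023, 2, 10), datetime(2023, 2, 15)]
print(needs_sync(source, paths, dates))  # True
```

This mirrors the shortcoming noted above: a file older than the collection-wide max is skipped even if its own target copy is stale.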
Thank you for your reply @Expiscornovus. I would need to compare each item's modified date, since the sync might pick up different documents each time as they are approved or worked on at different times. A file might not be touched for a year while another document is worked on daily. The library in question has around 50,000 documents.
Thank you for your reply @grantjenkins
Are they both Document Libraries that would have identical folder structures/files?
Yes they have the same folder structure and files.
Assuming this would be a scheduled flow that ran daily/weekly?
It will run every 15 or 30 minutes, which is why I want it to only spend time on the documents that are out of sync.
Ok, thanks for clarifying. In that case what I provided isn't sufficient.
Let's wait for the response of @grantjenkins. I am sure he can come up with a great solution 😀
Yes, that does not work, since the main goal of this is to have one library with ONLY major versions of files and read-only permissions @Chriddle.
Maybe this helps:
I created an array "source" with 500 objects (I hope this is a reasonable number of changed files between two flow runs) and an array "destination" with 50,000 objects.
Maybe the number of destination objects can be reduced by a clever OData filter.
For each object in "source", the Select action "combined" looks up the destination entry with the same name and gets its "created" value with the help of xpath.
You can then filter this output with a date comparison.
With this number of values, the flow runs in roughly 2 minutes.
Of course, this is only a POC; you would still have to add the time portion to the dates, check what happens if the objects are bigger (because of longer names), and probably more 😉
source (Select):
"inputs": {
"from": "@range(0,500)",
"select": {
"name": "@concat('file-',string(item()))",
"created": "@concat('2023-02-', string(rand(1, 28)))"
}
}
destination (Select):
"inputs": {
"from": "@range(0, 50000)",
"select": {
"name": "@concat('file-', string(item()))",
"created": "@concat('2023-02-', string(rand(1, 28)))"
}
}
destinationXML (Compose):
"inputs": "@xml(json(concat('{\"root\":{\"item\":', body('destination'),'}}')))"
combined (Select):
"inputs": {
"from": "@body('source')",
"select": {
"name": "@item()['name']",
"created": "@item()['created']",
"created_destination": "@first(xpath(outputs('destinationXML'), concat('//item[name=\"',item()['name'],'\"]/created/text()')))"
}
}
Filter array:
"inputs": {
"from": "@body('combined')",
"where": "@greater(item()['created'], item()['created_destination'])"
}
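For what it's worth, the POC above can be sketched in plain Python (hypothetical, not the flow itself). Power Automate has no native lookup/join across arrays, which is why the POC round-trips through XML and XPath; in Python a plain dictionary plays that role:

```python
import random

# Sketch of the source/destination/combined/filter steps above.
# Days are zero-padded so string comparison of dates behaves sensibly.
random.seed(0)
source = [{"name": f"file-{i}", "created": f"2023-02-{random.randint(1, 28):02d}"}
          for i in range(500)]
destination = [{"name": f"file-{i}", "created": f"2023-02-{random.randint(1, 28):02d}"}
               for i in range(50000)]

# "combined": for each source item, look up the destination's created date
dest_by_name = {d["name"]: d["created"] for d in destination}
combined = [
    {"name": s["name"], "created": s["created"],
     "created_destination": dest_by_name.get(s["name"], "")}
    for s in source
]

# "Filter array": keep source items newer than their destination copy
changed = [c for c in combined if c["created"] > c["created_destination"]]
```

The dictionary lookup is O(1) per source item, which is the same effect the XML + XPath trick achieves inside the flow without an Apply to each.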
I think I might have something that will work.
One concern is the number of files in each library (50,000). We can't just apply a Filter Query within our Get files actions, so we would need to return all files from both libraries (100,000+ files) and then apply the filtering. The filtering itself will be quick; it's the initial retrieval of files that takes time.
The other concern is that if a lot of files are out of sync, copying them across could take quite a while, depending on the number and size of those files. One thing I haven't done here is copy the actual metadata (properties) across. That can easily be added, but I'm not sure if your requirement is just the file sync, or the properties too.
If you go with this approach, you may need to run the flow manually a few times to see how long it takes to complete, then schedule the flow accordingly.
See full flow below. I'll go into each of the actions.
Get files Library A and Get files Library B are both using Get files (properties only) actions. They both have the filter FSObjType eq 0 which means only get files (not folders). I've also set the Top Count to 5000 for each of them.
I've also gone into the Settings for Get files Library A and Get files Library B, turned on Pagination, and set the Threshold to 60000 (needs to be a number larger than the number of files you will have over the next couple of years at least). This will take a while to retrieve all your files.
Select extracts out a couple of properties from Get files Library B that we will convert to XML so we can apply XPath within the filter later. The expressions used are:
//FullPath - removes the library name from the full path
join(skip(split(item()?['{FullPath}'], '/'), 1), '/')
//Modified - replaces characters so we are left with a number (required for XPath comparison)
replace(replace(replace(replace(item()?['Modified'], '-', ''), 'T', ''), ':', ''), 'Z', '')
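As a side note, those two Select expressions behave like the following Python sketch (sample values are made up), which may help when debugging them:

```python
# Illustration only, not Power Automate code.
def strip_library(full_path: str) -> str:
    # join(skip(split(path, '/'), 1), '/') - drop the first path segment
    return "/".join(full_path.split("/")[1:])

def modified_as_number(modified: str) -> str:
    # replace chain: strip '-', 'T', ':', 'Z' so the ISO date becomes a
    # plain digit string that XPath can compare with < and >
    for ch in "-T:Z":
        modified = modified.replace(ch, "")
    return modified

print(strip_library("LibraryB/Folder1/report.docx"))  # Folder1/report.docx
print(modified_as_number("2023-02-14T10:35:02Z"))     # 20230214103502
```

Digit strings of equal length compare the same way lexically as numerically, which is what makes the XPath comparison later in the flow work.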
Filter array uses the output from Get files Library A and the following expression to keep the items that have been updated in Library A since being copied to Library B (i.e., in need of updating). It uses an XPath expression to compare both the FullPath (including the filename) and the Modified date; if the number of matching items is greater than 0, the item needs to be updated.
@greater(
length(
xpath(
xml(json(concat('{"root": { value:', body('Select'), '}}'))),
concat('//root/value[FullPath = "', join(skip(split(item()?['{FullPath}'], '/'), 1), '/'), '" and Modified < "', replace(replace(replace(replace(item()?['Modified'], '-', ''), 'T', ''), ':', ''), 'Z', ''), '"]')
)
),
0
)
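To make that comparison concrete, here is a rough Python sketch of the filter logic (sample items are invented; the real flow does this via XPath over the XML built from body('Select')):

```python
# Hypothetical Select output for Library B (path stripped, date flattened).
select_b = [
    {"FullPath": "Folder1/report.docx", "Modified": "20230210120000"},
    {"FullPath": "Folder1/notes.docx",  "Modified": "20230220090000"},
]

def needs_update(item_a):
    # Strip the library name, flatten the date, then look for a Library B
    # entry with the same path but an older Modified value.
    path = "/".join(item_a["FullPath"].split("/")[1:])
    mod_a = item_a["Modified"].translate(str.maketrans("", "", "-T:Z"))
    matches = [b for b in select_b
               if b["FullPath"] == path and b["Modified"] < mod_a]
    return len(matches) > 0

updated = {"FullPath": "LibraryA/Folder1/report.docx",
           "Modified": "2023-02-15T08:00:00Z"}
unchanged = {"FullPath": "LibraryA/Folder1/notes.docx",
             "Modified": "2023-02-20T09:00:00Z"}
print(needs_update(updated), needs_update(unchanged))  # True False
```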
Apply to each iterates over each of the items in our Filter array (files that need to be updated).
Copy file uses the following expressions for File to Copy and Destination folder.
//File to Copy
item()?['{Identifier}']
//Destination Folder - NOTE that you would need to put your Library names here
slice(replace(item()?['{Path}'], 'LibraryA', 'LibraryB'), 0, lastIndexOf(replace(item()?['{Path}'], 'LibraryA', 'LibraryB'), '/'))
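For illustration only, the Destination Folder expression behaves like this Python sketch (the library names are placeholders you would swap for your own):

```python
# Swap the library name in the path, then slice off the filename.
def destination_folder(path: str) -> str:
    swapped = path.replace("LibraryA", "LibraryB")
    # slice(swapped, 0, lastIndexOf(swapped, '/'))
    return swapped[:swapped.rfind("/")]

print(destination_folder("/LibraryA/Folder1/report.docx"))
# /LibraryB/Folder1
```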
----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
I think I got it close to working; I will test a bit more tomorrow. But I would need the Filter array to also include any items that don't already exist in the destination. @grantjenkins
Thank you for your help this far 😃
@StretchFredrik Yea, I was thinking about that and have already started a bit of a redesign, but I'm off to sleep now, so I won't be able to get anything to you until a bit later. I know how to achieve it; I just need to rebuild a part of it.
Also, what if you delete an item from the source - would you also need to delete that from the destination?
Sounds awesome, take your time and sleep well. The delete part I've already done by comparing URLs with the Filter array step. It then loops over the files present in the destination but not in the source and tries to find each file by document ID, to see if it has been moved in the source but not published yet. If it doesn't find it by document ID, the file gets deleted.
Thank you for your time @grantjenkins
@StretchFredrik Hopefully this works as expected. It should copy over both updated files and newly added files. I combined what @Chriddle did with my solution, as it's a lot nicer and makes it easier to handle the newly added files.
See full flow below. I'll go into each of the actions.
Get files Library A and Get files Library B are the same as the original solution.
Select B extracts just the Modified date and the Full Path (excluding the Library Name) from Get files Library B. The expression used is:
join(skip(split(item()?['{FullPath}'], '/'), 1), '/')
Select A extracts out the Identifier, Modified date and Path from Get files Library A, plus the matching Modified date from Library B. The expression used to get the Modified date from Library B is:
xpath(
xml(json(concat('{"root": { value:', body('Select_B'), '}}'))),
concat('string(//root/value[FullPath="', join(skip(split(item()?['{FullPath}'], '/'), 1), '/'), '"]/Modified/text())')
)
Filter array uses the output from Select A with the following filter.
//ModifiedB is empty (new file added) or ModifiedA is greater than ModifiedB (file updated in Library A)
@or(
equals(item()?['ModifiedB'], ''),
greater(item()?['ModifiedA'], item()?['ModifiedB'])
)
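As a sanity check, the Select A lookup plus this filter can be sketched in Python (hypothetical sample data): an empty ModifiedB marks a new file, and a smaller ModifiedB marks an updated one.

```python
# Library B lookup table, standing in for the XML/XPath lookup in the flow.
library_b = {"Folder1/report.docx": "2023-02-10T12:00:00Z"}

def lookup_modified_b(full_path: str) -> str:
    # XPath string(...) returns '' when no node matches, i.e. a new file
    return library_b.get(full_path, "")

select_a = [
    {"Identifier": "1", "FullPath": "Folder1/report.docx",
     "ModifiedA": "2023-02-15T08:00:00Z"},   # updated in Library A
    {"Identifier": "2", "FullPath": "Folder2/new.docx",
     "ModifiedA": "2023-02-14T09:00:00Z"},   # not in Library B yet
]
for item in select_a:
    item["ModifiedB"] = lookup_modified_b(item["FullPath"])

# Filter: new file (ModifiedB empty) OR updated (ModifiedA > ModifiedB)
to_copy = [i for i in select_a
           if i["ModifiedB"] == "" or i["ModifiedA"] > i["ModifiedB"]]
print([i["Identifier"] for i in to_copy])  # ['1', '2']
```

Note that the order of the two conditions matters in the flow as well: the empty check must come first so the greater() comparison never sees a null.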
Apply to each iterates over each of the items in our Filter array.
Copy file uses the following expressions to copy the new/updated files from Library A to Library B.
//File to Copy
item()?['Identifier']
//Destination Folder - NOTE that you would need to put your Library names here
slice(replace(item()?['Path'], 'LibraryA', 'LibraryB'), 0, lastIndexOf(replace(item()?['Path'], 'LibraryA', 'LibraryB'), '/'))
----------------------------------------------------------------------
If I've answered your question, please mark the post as Solved.
If you like my response, please consider giving it a Thumbs Up.
Thank you @grantjenkins. The Filter array step fails with "Greater expects all of its parameters to be either integer or decimal numbers. Found invalid parameter type 'Null'". So I'm guessing the filter query fails when ModifiedB is null (empty).
Did you put the filters in the same order that I had? If ModifiedB is empty, then it wouldn't try to evaluate the second filter.
This is what I have; it should be the same:
Never mind, I'm stupid, I forgot to change the input of the Filter array @grantjenkins. Will try again! 😃