Anonymous
Not applicable

ClearCollect for more than 2000 records

Hello all,

 

I have tried ClearCollect-ing a SharePoint list that holds more than 2100 records.

 

But unfortunately, when I run CountRows on the resulting collection, it only reports 2000 records.

 

I also tried ClearCollect with a filter on the data source, but the filter was only applied to the first 2000 records.

 

How can I overcome this?

22 Replies

 

 


 

Hi @dinusc ,

 

I'd like to use Option 2 you mention but have no idea how to implement it. I've used Collect(dsfinal,ds1,ds2) as shown by Drrickryp, but my collections each retrieve only the first 2000 rows, and my data is over 6000 rows.

 

How would you collect 1-500, 501-1000, etc.? What would the formula look like? I'm not sure how that would work and cannot find any examples.

 

I've also tried Collect(dsfinal(If(CustomersList.Custgrp = GalleryAgencies.Selected.Custgrp)2000), but without success. I was hoping the agency would be the factor in collecting fewer than 2000 items at a time, but it's just not working.

 

Thanks in advance.

Anonymous
Not applicable

@Bren0074  I built a solution for this last week because I was having the exact same issues. A warning though, it's not elegant - but it does work 🙂

 

The solution is to use SharePoint ID numbers.

1. First, sort by ID Descending, collect that into collection1 (col1)

2. Then, sort by ID Ascending, collect that into collection2 (col2)

3. Build an array of 'missing ID numbers' as a collection - call it colTarget. These are the IDs in the gap between (MaxID of col2) and (MinID of col1)

4. Use ForAll() with Collect() to get the missing ID rows one by one into collection3 (col3)

5. [optional] Collect col1, col2 & col3 into finalCol

 

Step 4 can take a while if 'the gap' between Min/Max is large, so just be mindful of that and maybe think about whether you need all the data in your app. I put a label in my dev app set to CountRows(col3), and you can watch it tally up as data gets collected - it gives you an idea of data collection speed.

 

You could always build a smaller array (Step 3) of missing IDs that gets tacked onto col1 if you don't need all of the data, e.g. (MinID col1) - 500 instead of (MinID col1) - (MaxID col2).
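For example, that smaller version of Step 3 might look something like this - an untested sketch, which assumes the Sequence function is available in your environment:

```powerfx
// generate only the 500 candidate IDs directly below col1's minimum
ForAll(Sequence(500), Collect(colTarget, {RowId: Min(col1, ID) - Value}));
// drop any candidates that overlap col2's range, in case the gap is smaller than 500
RemoveIf(colTarget, RowId <= Max(col2, ID));
```

Note that if the real gap is larger than 500, this variant simply leaves the older rows uncollected, which is the point.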

 

Apologies for all the blah above, here's my code solution:

Note: my Step 3 is adjusted from @Prem_ddsl's cool code here Dynamic Loops

 

Step 1:

ClearCollect(col1,SortByColumns(yourSPList,"ID",Descending))

 

Step 2:

ClearCollect(col2,SortByColumns(yourSPList,"ID",Ascending))

 

Step 3:

//calculate the size of the ID gap to fill

Set(vMaxRows,Min(col1,ID)-Max(col2,ID))

 

//define a base collection from which you will generate your target collection. A colBase of 60 will be able to collect up to 60x60 = 3600 items. Set colBase's size to meet your needs.
ClearCollect(colBase,[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60]);
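As an aside - if your environment has the Sequence function, that long literal table can be generated instead of typed out (a sketch, not part of the original solution):

```powerfx
// Sequence(60) returns a single-column table with a Value column holding 1..60,
// the same shape as the literal [1,2,...,60] above
ClearCollect(colBase, Sequence(60));
```

To collect more rows, just increase the argument (e.g. Sequence(80) supports up to 80x80 = 6400 items).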


//setup collections to be used as inner and outer counters
ClearCollect(colOuter,colBase);ClearCollect(colInner,colBase);Set(vOuterMax,CountRows(colOuter));


//generate array of missing ID numbers 
ForAll(colOuter,ForAll(colInner,Collect(colTarget,{RowId:(Max(col2,ID)+colInner[@Value])+(vOuterMax*(colOuter[@Value]-1))})));

 

RemoveIf(colTarget,RowId>=Min(col1,ID));  // remove unwanted ID numbers

 

Step 4:

Clear(col3);

ForAll(colTarget,Collect(col3,Filter(yourSPList,ID=RowId)));

 

Step 5:

ClearCollect(finalCol, col1, col2, col3)

 

As I say, not elegant but effective. Hope this helps 🙂

 

Cheers

 

Anonymous
Not applicable

Oops - where a 'sad face' emoji appears in the code above, it should actually be the characters:

: ( 

 

i.e.

ForAll(colOuter,ForAll(colInner,Collect(colTarget,{RowId : ( Max(colSiteInduct2,ID)+colInner[@Value])+(vOuterMax*(colOuter[@Value]-1))})))

Anonymous
Not applicable

@Anonymous Thanks for this - really excellent work. One thing I wondered about is whether it's possible to modify this to include a filter on the data source to remove unnecessary items. E.g. I have a list with 40,000 items, but I only need to pull in ~7,000 of them, based on a status text field, so I'd like to filter out the 33,000 items that are in a "Completed" status. Is this achievable using this same method?
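What I have in mind is something like the following untested sketch, where Status is my hypothetical text column - though I realise it still pulls all the rows first, which is what I'd like to avoid:

```powerfx
// after building finalCol via Steps 1-5, filter locally on the client,
// where the non-delegable restriction no longer applies
ClearCollect(colActive, Filter(finalCol, Status <> "Completed"));
```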

Hi there,

 

I have a data source of about 16000 rows, so I've tried to work around the 2000 limit by implementing your code, @dinusc:

 

ClearCollect(DummyCollection;Filter(DataDedo;"BatchID1" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID2" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID3" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID4" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID5" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID6" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID7" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID8" in BatchID));;
Collect(DummyCollection;Filter(DataDedo;"BatchID9" in BatchID))

but I'm stuck, because my DummyCollection only contains 2000 rows...

Have I misunderstood your proposal?

Thanks

 

@GeekAlf_Pro, I suspect your multiple "Collect" formulas do not pull the data you expect (since the "in" operator is not delegable), and this is the reason you can't collect more rows. Here is how non-delegable functions work:

1. You define your search criteria which uses non-delegable functions.

2. The function pulls the unfiltered batch of records to the client side (up to 2K).

3. The function applies the filtering on the client side (since the function cannot be delegated to the back end).

In case none of the records from the retrieved batch correspond to the search criteria, the search result is empty.

My suggestion was to use sub-collections with delegable functions (something like ID > 5000 && ID < 10000) to pull all the records to the client and then apply the desired filtering. Please note that this will cause your client app to use a significant amount of memory (which is one of the main reasons for the record limitation in the first place), and you should not do this if your application will be used on older mobile devices with limited memory or low-speed network connections.
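To illustrate, the sub-collection approach might look like the following sketch. The list name and ranges are placeholders, each Filter uses only delegable operators so it runs on the server, and each range must contain fewer than 2000 records for its batch to come back complete:

```powerfx
// pull the data in delegable ID ranges
ClearCollect(colAll, Filter(yourSPList, ID >= 1 && ID < 2000));
Collect(colAll, Filter(yourSPList, ID >= 2000 && ID < 4000));
Collect(colAll, Filter(yourSPList, ID >= 4000 && ID < 6000));
// now apply the non-delegable filter locally, on the full local collection
ClearCollect(colResult, Filter(colAll, "BatchID1" in BatchID));
```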

Hi @dinusc,

 

thank you very much for your quick response.

 

My comments below

 

 


@dinusc wrote:

@GeekAlf_Pro, I suspect your multiple "Collect" formulas do not pull any data (since they're not delegable) and this is the reason you can't collect them. Here is how non-delegable functions work: [GeekAlf_Pro] Right! It's a OneDrive Excel sheet.

1. You define your search criteria which uses non-delegable functions.

2. The function pulls the unfiltered batch of records to the client side (up to 2K). [GeekAlf_Pro] That's where I don't understand - I thought that filtering on the BatchID would let me enrich my local collection, since each filter keeps me below the threshold.
If I understand you correctly, you're saying that the filtering is useless, right?

3. The function applies the filtering on the client side (since the function cannot be delegated to the back end).

In case none of the records from the retrieved batch correspond to the search criteria, the search result is empty.

My suggestion was to use sub-collections with delegable functions (something like ID > 5000 && ID < 10000) to pull all the records to the client and then apply the desired filtering. Please note that this will cause your client app to use a significant amount of memory (which is one of the main reasons for the record limitation in the first place), and you should not do this if your application will be used on older mobile devices with limited memory or low-speed network connections. [GeekAlf_Pro] I'll try it and report back. Thanks again.




To provide more information about #2 below: filtering by batch ID is no different than any other filtering. The idea of "sectioning" data into batches is to retrieve smaller batches using simple, delegable functions. In your example, you cannot apply a non-delegable filter to 16K records. So the idea is to pull all 16K records to the client using simpler, delegable functions and then apply the desired filter locally (on the client). This is not always straightforward to implement because, as mentioned in the original post, you may need to re-design your backend storage (for example by adding a unique numeric index column - a batch ID).
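For instance, with an added numeric index column (here a hypothetical BatchNum holding 1-9, each batch under 2000 rows), the collects could be as simple as the sketch below. This assumes a data source that supports delegation - a OneDrive Excel connection does not delegate at all, so it would require moving the data to something like a SharePoint list first:

```powerfx
// a simple equality test on a numeric column is delegable
ClearCollect(DummyCollection, Filter(DataDedo, BatchNum = 1));
Collect(DummyCollection, Filter(DataDedo, BatchNum = 2));
// ...repeat for the remaining batches, then apply the
// non-delegable "in" filter to DummyCollection locally
```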

Thanks for pointing towards Mr Dang's solution. You could also use a Flow to get the number of items in your data source, i.e. an ItemCount.

 

 
