
500 item limit in CDM entity search filter (need to switch to an ASP.NET app if this persists)

The 500 item limit in the CDM entity search filter makes it very difficult to use for any business scenario (export, data analysis).

I have 50k records, and the search filter may sometimes return 5k or 20k of them, which I then need to analyze (so I need to export).

Currently it only exports the first 500 items, which does not meet any business criteria (imagine doing a Google search and getting only 3 results). Sadly, if this is a permanent limitation like the SharePoint list 5k limit, I will have to inform our project sponsors, and since it does not meet the business need to filter and export, we will most likely have to build an ASP.NET app, which we did not want to do.

At the very least I need a good workaround. One thing I noticed is that there is an export data link in the CDM screen (can you give me a workaround based on that?)

121 REPLIES
mr-dang
Community Champion

I have a temporary solution until delegation works. It is impractical and can introduce flaws in writing data. It is also inefficient in some parts, yet has benefits in others. But if you need those records, this works.

 

Big idea:

  • Import the entire Entity to a temporary collection.
  • Read from the collection instead of the original Entity.
  • If you need to write any data back to the datasource, write it to the temporary collection too.
  • If an entry already exists, look up the original record to modify.

I use @hpkeong's model of a timer. The timer imports 500 records at a time (the limit in PowerApps). To achieve this, every entity must have a field identifying "which 500" each record belongs to (e.g. record 766 belongs to the second block of 500 records). I just call my field "n" and set its type to "number."
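For example, assuming PrimaryId is a sequential numeric ID, the n value for record 766 works out like this (the Writing section below uses the same formula when saving records):

RoundDown(766/500, 0) + 1	// = 1 + 1 = 2, i.e. the second block of 500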

 

Import

 

Create a button that activates a timer which will import the next 500 records each time it ends/repeats.

 

 

Button1.OnSelect:
UpdateContext({maxn: First(Sort(datasource,PrimaryId,Descending)).n});
Clear(tempdata); UpdateContext({import: true, iter: 0})

 

Set the Timer1.OnTimerEnd to:

 

 

If(import,
	If(iter<maxn,
		UpdateContext({iter: iter+1});
		
		Collect(tempdata,
			Filter(datasource,n=iter)
		),
		
		UpdateContext({import: false})
	)
)

 

Then set the Timer's properties:

 

Timer1.Duration: 1
Timer1.Start: import
Timer1.Repeat: import
Timer1.AutoStart: import

 

How does this work?

The variable "maxn" is determined ahead of time--it looks for the a value that describes how many sets of 500 you have. So if you have 10,000 records, and you correctly programmed the way your n field is written, then you would expect maxn to be 20. Unfortunately you can't simply use Max(datasource,n) since delegation is not supported.

 

Click Button1 to begin importing. The "import" variable activates the timer properties. While import is true, the timer checks how many iterations (iter) have been completed so far. Since iter starts at 0, which is less than maxn (the expected number of groups of 500 to import), the timer imports the next group of 500. This repeats until iter equals maxn, at which point the timer is deactivated.

 

Flaw 1: if the last record in the datasource is blank for n, then nothing will load. This can easily happen if you are in the middle of writing and the app times out your session.
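One possible guard (an untested sketch, assuming PrimaryId is numeric and the Coalesce function is available): fall back to deriving maxn from the last record's PrimaryId when its n is blank, so the rest of the import can still proceed (the blank-n record itself would still be skipped by the n=iter filter):

UpdateContext({maxn:
	Coalesce(
		First(Sort(datasource, PrimaryId, Descending)).n,
		RoundDown(Value(First(Sort(datasource, PrimaryId, Descending)).PrimaryId)/500, 0) + 1
	)
});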

 

I previously had the Timer compare to see if the next n existed, but that was very slow--foolproof but slow:

 

If(import,
	If(!IsEmpty(Filter(datasource,n=iter+1)),
		UpdateContext({iter: iter+1});
		
		Collect(tempdata,
			Filter(datasource,n=iter)
		),
		
		UpdateContext({import: false})
	)
)

 

Flaw 2: The problem with the method above is that there was a version of PowerApps in which filtering the datasource by "n=iter+1" ran into service limitations. iter+1 was too complicated for the formula, so I had to surface iter through Timer1.Text:

 

Timer1.Text: iter

Timer1.OnTimerEnd:
If(import,
	If(!IsEmpty(Filter(datasource,n=Timer1.Text+1)),
		UpdateContext({iter: iter+1});
		
		Collect(tempdata,
			Filter(datasource,n=Timer1.Text)
		),
		
		UpdateContext({import: false})
	)
)

 

Writing

You need to do the writing in 3 steps: 

  1. Write a new record normally, but copy the new record to a variable.
  2. Take the new record and calculate a correct value for which set of 500 it belongs to. 
  3. Write the same record to the temporary collection that you are using instead of the original datasource.

 

UpdateContext({temprecord:
	Patch(datasource, Defaults(datasource),
		{field1: data,
			field2: data
		}
	)
});

UpdateContext({temprecord:
	Patch(datasource, First(Filter(datasource,PrimaryId=temprecord.PrimaryId)),
		{n: RoundDown(Value(temprecord.PrimaryId)/500,0)+1
		}
	)
});

Collect(tempdata, temprecord)

 

This gets more complicated if you want to update an existing record--or to check that one exists. Either way, the n value (which set of 500 the record belongs to) can only be calculated once the record's unique PrimaryId has been assigned, since it is derived from that value. This is why you need to Patch twice.

 

I originally collected the temprecord as it was written, but I came across some writing errors. I've listed it below for reference, but YMMV:

 

UpdateContext({temprecord:
	Patch(datasource, Defaults(datasource),
		{field1: data,
			field2: data
		}
	)
});

Collect(tempdata,
	Patch(datasource, First(Filter(datasource,PrimaryId=temprecord.PrimaryId)),
		{n: RoundDown(Value(temprecord.PrimaryId)/500,0)+1
		}
	)
)

 

Flaw 3: Writing needs to access the datasource twice. This could be instant sometimes, but it can also be very slow. 

 

Flaw 4: As mentioned before, if the app closes in the middle of writing, the n value might not be written. This could break your importing unless you use the slower importing method (see Flaws 1 and 2 under Import).

 

You could arguably calculate all the n values later using UpdateIf or ForAll, but then you would need to figure out a new way of importing records whose n has not yet been calculated.
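A back-fill along those lines might look like the sketch below; note that UpdateIf is not delegable here, so it would only repair the records the app can actually retrieve:

// Sketch only: back-fill any missing n values, assuming PrimaryId is numeric
UpdateIf(datasource, IsBlank(n),
	{n: RoundDown(Value(PrimaryId)/500, 0) + 1}
)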

 

Final Thoughts

This method is only as good as your ability to keep the data current. If you have multiple users accessing the same data, or even just multiple instances open in the web player, I do not yet have a reasonable way of keeping the data in sync. In my heaviest entity, reloading about 20 iterations of 500 can take 2 minutes.

 

The ability to open multiple instances is a blessing. I would not trade it for anything. However, it does open the possibility of data inaccuracy if you use this method. Other users might make changes to the original datasource, yet you would never know it.

 

The ForAll function came out recently, but I have not had much time to play with it yet. I imagine you could work out a way to import or write multiple records with it.
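For what it's worth, a ForAll-based import might look something like this sketch, reusing the maxn variable from the Import section and assuming the newer Sequence function and As operator are available (I have not measured whether it is faster than the timer):

// Sketch only: import each block of 500 in one ForAll pass
Clear(tempdata);
ForAll(Sequence(maxn) As blk,
	Collect(tempdata, Filter(datasource, n = blk.Value))
)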

Microsoft Employee
@8bitclassroom
v-yamao-msft
Community Support

Hi AmitLoh-Powerap,

 

When I create an app based on a custom entity that has more than 10,000 items, only 500 items are shown in the app, even when no Filter/Search is applied.

 

There is an article about “Import or export data from the Common Data Service”; is this "export" the same as the “export data link in CDM screen” that you mentioned?
https://powerapps.microsoft.com/en-us/tutorials/data-platform-export-data/


Best regards,
Mabel Mao

Community Support Team _ Mabel Mao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

thanks all!

 

I just need export for now. I will try the timer example, but it looks a little complicated.
https://powerapps.microsoft.com/en-us/tutorials/data-platform-export-data/
The export data link in the CDM screen only works when you have less data. If I have 100k records in an entity, it does not export anything. There is a bug in this export.

 

Hi!
I was able to overcome the 500 item limit without using an n column and without modifying any existing schema!!
Thanks mr-dang and hpkeong for your suggestions. I used a timer together with ForAll to do this, without adding any additional n column.

@AmitLoh-Powerap, can you share your solution with the timer and ForAll?

Microsoft Employee
@8bitclassroom

I will update over the weekend; it's kind of slow but works. It could be made faster, but I was not able to update a variable from inside ForAll (ForAll does not allow updating a variable).

I have a workaround for ForAll not being able to update a variable, but it doesn't always make sense, since ForAll does not necessarily run in the order you may want.

 

If you want a Count of things that are finished by ForAll, then you can use:

 

ForAll(datasource,
	[your other actions go here];	
	Patch(universalvariables,First(universalvariables),
		{var: First(universalvariables).var+1
		}
	)
)

 

Then you can reference the variable as First(universalvariables).var.
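Note that universalvariables here is a single-row collection that needs to be seeded before the ForAll runs; a minimal sketch:

// Seed the one-row "variable" collection with a starting count of 0
ClearCollect(universalvariables, {var: 0})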

 

Or you can use Collect and rely on CountRows instead. This requires you to clear the Count earlier:

Clear(countcompleted);

ForAll(datasource,
	[your other actions go here];	
	Collect(countcompleted,
		{Value: 1
		}
	)
)

In this case, you would reference this count as CountRows(countcompleted). ClearCollect does not work.

This is only useful for getting a count of items completed by ForAll. I do not yet have a way to update a "variable" holding Text or Boolean values.

 

Microsoft Employee
@8bitclassroom

Hi Dang,

 

Please see my completely different solution!! This works but I need your help!

Please see the flow (screenshot attached below). For now it is based on a recurrence trigger, for testing. The flow has the following steps:

  1. Recurrence trigger.
  2. Get the number of items you want to iterate over: I created a "count" CDM table set to 20 (20 × 500, so you get 10k items fast!).
  3. Apply to each (this loop runs 20 times).
  4. Inside the loop, get the actual entity you want to export. Important: add a filter with RecordID ge lastcountcdmtable.
  5. In the same loop, use '@last(@{outputs('Compose')}) to get the last record (the 500th); I am now trying "@parameters('recordid')" to return the 500th/last id as a string.
  6. Once you have the last record id, store it in the lastcount CDM table.

 

This gives you 20 loops of 500 records each, but now I need help composing the output (previous compose + compose) 20 times and then converting it to CSV. I noticed this process is much faster than the timer-based loop.

[Attached flow screenshot: 500.png]

 

For getting the last record, refer to this thread:

https://powerusers.microsoft.com/t5/Flow-Forum/how-to-get-last-row-from-odata-filter-result/m-p/2107...

 

 

My other solution based on a timer in PowerApps works, but it is super slow: 2 minutes to export 5k records, whereas this one takes around 30 seconds to get 10k! I need 2 final things: compose all 20 sets of 500 into a single entity, and convert it to CSV using either JSON or a custom API / Azure Function, and we are done!!
