mr-dang
Community Champion

Migrating from Excel to CDM

The write speed to CDM is looking very attractive. Currently, Patch takes about 3 seconds against an Excel spreadsheet, while writes to CDM show up in real time, as they should.

 

That said, the two seem to operate very differently. When I remove an Excel datasource and attempt to add a CDM datasource of the same name, PowerApps can't handle delegation of all my formulas at once--understandable. RAM usage spikes to a whopping 10GB and CPU usage to 100%.

 

It seems that Excel datasources are a snapshot of the data that is not updated until you perform a write or refresh function. Because of this, the app is not constantly reading from the datasource (you don't see the dots roll across the top of the screen). CDM, on the other hand, reads from the datasource in real time, so it's always swirling at the top. You really shouldn't use too many Filter() calls on the raw source; it seems to work better to Collect it into a temporary collection and re-collect whenever you make changes.
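For example, the pattern I have in mind is roughly this--LocalItems, MyDataSource, and the Status column are just placeholder names:

ClearCollect(LocalItems, Filter(MyDataSource, Status = "Active"))  // take a snapshot once, e.g. in Screen.OnVisible

Galleries and other formulas then read from LocalItems instead of the raw source, and after each write I re-collect:

Patch(MyDataSource, ThisItem, {Status: "Done"});
ClearCollect(LocalItems, Filter(MyDataSource, Status = "Active"))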

 

I am not a programmer by trade, so I am wondering: do most users work in temporary collections of the original datasources?

 

This whole time I kept thinking I wanted all of my formulas to operate on the original source--but it seems I am finding faster performance by working locally. If I could read more than 500 records, I don't think I would need complete delegation after all, since I would just work in collections.

Microsoft Employee
@8bitclassroom
1 ACCEPTED SOLUTION

Steelman70
Power Participant

Hi Mr Dang, I work with both temporary collections and original sources.  I use temporary collections mostly for menu items in drop-downs that do not change during a work session.  It speeds things up with relatively few side effects.
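For example, something like this for a drop-down (Products, Category and the control names are only examples):

ClearCollect(CategoryList, Distinct(Products, Category))  // cache the choices once, e.g. in Screen.OnVisible

Dropdown1.Items:
CategoryList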

 

I think it is better to work with original sources for many reasons*, but currently there is a performance issue, as I mentioned in another one of your posts. I expect that as PowerApps and CDM develop, the programme team will introduce optimisations in PowerApps itself, such as accessing the data source only when it is really needed. Then we will not need to use temporary collections at all.

 

Anyway, in one app I used ID filters to load more than 500 records (1,600, actually) into my temporary collection, and this works fine, although it is a bit clumsy. Something like this:

ClearCollect(MyTemporaryCollection, Filter(MyDataSource, ID<500));
Collect(MyTemporaryCollection, Filter(MyDataSource, ID>=500 && ID<1000));
Collect(MyTemporaryCollection, Filter(MyDataSource, ID>=1000 && ID<1500));
Collect(MyTemporaryCollection, Filter(MyDataSource, ID>=1500))
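Each of those ranges covers at most 500 IDs, so every chunk stays under the 500-record limit. This does assume the ID column is a more or less continuous number sequence; if your IDs have big gaps or run well past 2000, adjust the boundaries or add more Collect lines.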

 

 In any case this is only a work-around until the optimisations are introduced in PowerApps itself.

 

* Simpler apps, fewer issues with data synchronisation, immediate feed-back in case of input errors etc. etc.


3 REPLIES

Using ID looks like a good interim solution. 

 

I took your suggestion and tweaked it in a way that I can transition when the time is right.

  • My temporary collection for pulling in 500 records at a time has the same name that my final connected datasource will have. Until PA can read more than 500 records, I'm using a dummy entity, which I will later duplicate under the name of the temporary collection.

    I think this makes sense since my formulas would operate on the temporary collection anyway. Then, to transition, I would only need to connect the final datasource and tweak any formulas that write back to the source.
  • PA will not handle more than one condition in each Filter--at least with CDM. 

    I added a column to my entity which divides Title by 500 to segment each block of 500 for collecting. I first tried it with AddColumns, but it did not work, so I put it in my formulas for writing back to source:
    n: RoundDown(Value(Right(Title,10))/500,0)+1
  • I can now use your formula, but with a combination of a toggle, an OnVisible property, a timer, and a slider to make it loop. The OnVisible property resets the Toggle. The Toggle reloads the datasource. It simulates Refresh(MyDataSource) for my temporary collection. 
    Screen.OnVisible: 
    UpdateContext({resetdata: !resetdata});
    UpdateContext({resetdata: !resetdata})
  • Toggle1.Reset:
    resetdata
    
    Toggle1.Default: 
    resetdata
    
    Toggle1.OnCheck:
    Refresh(MyDataSource);
    Clear(MyTemporaryCollection);
    UpdateContext({pulldata: true, iter: 0})
  • Timer1.OnTimerEnd:
    If(pulldata,
        If(iter < Slider1.Value,
            UpdateContext({iter: iter + 1});
            Collect(MyTemporaryCollection, Filter(MyDataSource, n = iter)),
            UpdateContext({pulldata: false})
        )
    )
    This is an absurd workaround for looping. CountRows(MyDataSource), Max(MyDataSource, n), and Last(MyDataSource).Title only operate on the first 500 records, so anything beyond that is a black box. There's no way to find out how many iterations you would need, so I just stuck in a slider for now, which I can edit later. (The timer settings that drive this loop are sketched right after this list.)
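For the loop to actually run, the timer has to keep firing while pulldata is true--something along these lines, with the values as placeholders to tune for your own app:

Timer1.Duration: 250        // milliseconds between iterations
Timer1.Repeat: true         // fire OnTimerEnd repeatedly
Timer1.Start: pulldata      // run only while the loop flag is on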

 

I think this will minimize the number of things I need to edit in order to transition to a datasource that can read more than 500 records.

 

 

Microsoft Employee
@8bitclassroom

Revision--my last post did not correctly address writing back to the source. It assumed that the n value identifying which group of 500 a record belongs to was already known.

 

In solving this problem, I have found that the solution gets less and less practical. Here's what I had to do:

  1. Patch normally, except for the n value. RecordId is written automatically, and Title will be too if you change it to "Number Sequence" when you first create the entity. However, those two columns are not known at the time of Patch, so you need to repatch.
  2. Set a variable equal to the record you just patched.
  3. Repatch the variable; this time RecordId and Title are known, so you can divide either by 500 to find out which group the record belongs to for collection.

This is what it looks like in a formula:

 

UpdateContext({temp:
    Patch(MyDataSource, Defaults(MyDataSource),
        {[your data here]
        }
    )
});

Patch(MyDataSource, temp,
    {n: RoundDown(Value(temp.Title)/500,0)+1
    }
)
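For example, wired into a save button it would look roughly like this (SubmitButton, TextInput1, and the Name column are placeholders; the last line just keeps the temporary collection in sync without reloading everything):

SubmitButton.OnSelect:
// first write: everything except n (RecordId and Title are assigned by the source)
UpdateContext({temp:
    Patch(MyDataSource, Defaults(MyDataSource),
        {Name: TextInput1.Text}
    )
});
// second write: temp.Title is now known, so compute which block of 500 the record falls in
UpdateContext({temp:
    Patch(MyDataSource, temp,
        {n: RoundDown(Value(temp.Title)/500,0)+1}
    )
});
// add the finished record to the local copy
Collect(MyTemporaryCollection, temp)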

 

 

Microsoft Employee
@8bitclassroom
