PJ99
Frequent Visitor

Plugin resource consumption vs Logic Apps

We have relatively complex business logic which I propose to code in a series of plugins. 

 

Is there a comparison of the cost of using plugins for frequently executing code versus deploying it in some other way? I can think of Azure Functions, Logic Apps, or a recurring app running on a virtual or physical machine, and I am sure there are other ways too - a WebJob maybe?

 

Or possibly a Flow in conjunction with something else.  Ease of deployment and testing is another consideration.

 

It strikes me that each option will have a computation cost - just as an example, Logic Apps might cost 5% of what plugins do, or might have the additional benefit of not impacting users' experience. However, I have been unable to locate any comparisons or advice on how to choose where to execute code against the CDS, or more particularly any indication of how much code is too much for a plugin.

 

Have you seen any such comparisons or advice on which option to choose when, for code executing against a CDS instance via the SDK? 

 

And is a Logic App the most likely alternative to a plugin or are there others I have missed?

 

 


5 REPLIES
ScottDurow
Memorable Member

I think you need to look at the different aspects of 'cost' - they are not really comparable in a direct way.

 

In Logic Apps you pay per executed step - but with plugins you get each execution 'for free'.

In both Logic Apps and plugins, calls to CDS are subject to limits that you will need to pay for once you exceed them - see the API entitlements: https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/api-limits-overview#entitlement...

 

In Logic Apps (and Azure Functions etc.), you will also need to be aware of the service protection limits, which cap the number of calls you can make in a given time window (these don't apply inside a plugin) - https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/api-limits-overview#service-pro...
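
To make the service protection behaviour concrete, here is a minimal sketch of how code running outside the pipeline (e.g. an Azure Function using the SDK) might honour the throttling fault. The helper name is illustrative; the error codes and the "Retry-After" detail key follow the documented service protection faults, but treat this as a sketch rather than a definitive implementation.

// Illustrative sketch: retrying an SDK call when a service protection limit
// fault is returned. Error codes are the documented service protection values;
// the helper name is hypothetical.
using System;
using System.ServiceModel;
using System.Threading;
using Microsoft.Xrm.Sdk;

public static class ThrottleAwareExecutor
{
    // Documented fault codes: number of requests, execution time, concurrency.
    private static readonly int[] ServiceProtectionCodes =
        { -2147015902, -2147015903, -2147015898 };

    public static OrganizationResponse Execute(
        IOrganizationService service, OrganizationRequest request, int maxRetries = 3)
    {
        for (int attempt = 0; ; attempt++)
        {
            try
            {
                return service.Execute(request);
            }
            catch (FaultException<OrganizationServiceFault> ex)
                when (Array.IndexOf(ServiceProtectionCodes, ex.Detail.ErrorCode) >= 0
                      && attempt < maxRetries)
            {
                // The fault carries a Retry-After hint saying how long to back off.
                var delay = ex.Detail.ErrorDetails.Contains("Retry-After")
                    ? (TimeSpan)ex.Detail.ErrorDetails["Retry-After"]
                    : TimeSpan.FromSeconds(5 * (attempt + 1));
                Thread.Sleep(delay);
            }
        }
    }
}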

 

Furthermore, plugins have a maximum execution time of 2 minutes, so they may not be appropriate for complex workloads.

You may also like to consider Power Automate instead of Logic Apps since it has tighter integration with CDS (transactions) and the license is included in the CDS licenses.

 

If the integration largely uses data outside of CDS and you already have Azure services running, then a large, complex scheduled workload may be best run as a scheduled Logic App, or as a scheduled Azure Function on a dedicated plan so you don't hit the execution time limits.
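
As a minimal sketch of that option (the function name and schedule are illustrative, not from this thread): a timer-triggered Azure Function, where on a dedicated App Service plan the host timeout can be raised so long batches are not cut off.

// Sketch of a scheduled Azure Function for an overnight batch workload.
// The function name and CRON expression are illustrative.
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlyCdsBatch
{
    // Fires at 02:00 UTC daily. On a dedicated plan the functionTimeout in
    // host.json can be raised, unlike on the consumption plan.
    [FunctionName("NightlyCdsBatch")]
    public static void Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
    {
        log.LogInformation("Starting scheduled CDS batch.");
        // Connect to CDS via the SDK or Web API and run the long batch here,
        // respecting the service protection limits discussed above.
    }
}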

 

If the calculations are entirely dependent on data inside CDS and can be broken down into small transactions that run as data is created/updated - then plugins are probably going to be the better option.

 

Hope this helps

Scott

 

 

 

Thank you Scott - very much appreciated. Yes, I was trying to reach a general conclusion as to which factors are most relevant and which can be discounted as noise, rather than an absolute cost comparison.

For example, I imagine a solution-aware Flow is easier to deploy than an Azure Function, so even though its execution costs might be higher, its maintenance costs going forward might be lower.

When you say plugins get each execution 'for free', can you think of any other restrictions besides the 2-minute timeout? Maybe we are less able to define CPU resources, or maybe you get quite grunty CPU resources for free as well?

And when we talk about API call numbers for both Entitlements and Service Protection Limits, do ExecuteMultipleRequest and RetrieveMultipleRequest count as a single call?

 

Thanks again - I really appreciate this response.  It was really helpful.

Paul.

ben-thompson

My first thought on your question is actually when and how the code will be triggered and, crucially, whether the changes need to be immediately reflected back to the user.

 

If the answer is that a record change is the trigger and the details need to be immediately reflected back, then my answer is going to be very different than if it's a batch process triggered overnight.

 

When I talk about plugins, my first bit of advice is that plugins should contain the logic that in the old days you would have wanted within the database - data validation checks and logical changes you want to perform. If it's more complex business logic, especially logic that requires code in an existing DLL you need to reuse, then an Azure Function may be a better bet.

 

As for what is too much code for a plugin, sadly the answer is that it depends on what you are doing - running a plugin on a quote or sales order line triggers a whole set of secondary internal plugins that running a single plugin on a custom entity doesn't.

Likewise a database update within a plugin is far more costly than adding another attribute value to the target entity in a pre-operation plugin.
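
As a minimal sketch of that cheaper pattern (the entity and attribute names are hypothetical): a pre-operation step writes to the in-flight Target, so the value is persisted by the same database operation, with no second update or extra pipeline pass.

// Pre-operation sketch: set a value on the Target entity so it is saved in
// the same database write. Entity and attribute names are hypothetical.
using System;
using Microsoft.Xrm.Sdk;

public class SetNetTotalPreOperation : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        if (context.InputParameters.Contains("Target") &&
            context.InputParameters["Target"] is Entity target)
        {
            // Cheap: mutate the in-flight record; no extra round trip.
            // A post-operation service.Update(...) here would instead issue a
            // second database write and re-fire the pipeline.
            var gross = target.GetAttributeValue<Money>("new_grosstotal");
            target["new_nettotal"] = new Money((gross?.Value ?? 0m) * 0.8m);
        }
    }
}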

 

So the best approach is very much 'it depends' - and it depends on each individual step within your business logic - which I know isn't what you want to hear, but it's true. Parts of your code may be better as plugins, others as a Flow, others as Azure Functions (due to external logic), and it may even be worth keeping some of it outside CDS using virtual entities.


Hi @PJ99 ,

 

Deploying Flows with solutions is definitely a plus - but if you use connectors other than the Common Data Service one, they still need to be managed outside of the solution. If your users already have Dynamics or PowerApps licenses, they will be licensed for unlimited runs of Flows - but you are limited by the number of calls and the types of connectors you can use.

  

You have no control over the compute power of plugins - they often run in a co-hosted sandbox and will run at a speed that depends on how busy the server is. The key point is that plugins should be short-running and lightweight by nature - as soon as you are asking 'how much compute power do I have?' and 'how can I make my plugin run faster?', it probably means your plugin is doing too much.

 

ExecuteMultiple does not count as a call itself - but the requests inside its body are each counted individually - so if you make an ExecuteMultiple call with 10 updates, it counts as 10 API calls. Of course, if each of those updates fires a plugin which then does other updates, each of those subsequent updates will also count as a call.

 

The guidance is never to use ExecuteMultiple inside a plugin - instead, use it from outside the platform pipeline to batch operations together (ExecuteTransactionRequest is the message to use when the batch must run as a single transaction).
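
To make the counting concrete, here is a small, illustrative sketch run from outside the pipeline (the class name and data are hypothetical): one SDK round trip, metered as ten API calls.

// Sketch: one ExecuteMultipleRequest wrapping 10 updates is metered as 10
// API calls against the entitlement, not 1. Data is illustrative.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class BatchExample
{
    public static void UpdateCells(IOrganizationService service, Entity[] cells)
    {
        var batch = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = false,
                ReturnResponses = false
            },
            Requests = new OrganizationRequestCollection()
        };

        foreach (var cell in cells) // e.g. 10 changed records
        {
            batch.Requests.Add(new UpdateRequest { Target = cell });
        }

        // One round trip, but each inner UpdateRequest counts individually,
        // plus any calls made by plugins those updates themselves trigger.
        service.Execute(batch);
    }
}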

 

Hope this helps,

Scott

@ScottDurow, @ben-thompson thank you - much appreciated. I think I have an answer now, but just in case you're interested:

I have a financial computation model I implemented previously in SQL Server where, for example, an employee's payroll has a number of interrelated numeric values - they could be currency amounts, hours worked, etc. Some of them are constant per pay, but the rest are calculated from any number of (parent) input values, which are constants, previously calculated values, or timesheet entries made via the Portal.

These granular calculations are exactly like cells in a spreadsheet. So each time a value changes, you know its dependent (child) calculated values also need to recalculate.

Now you can do this by registering a plugin on the cell that changes to trigger the (re)calculation of its child cells, and those updates will in turn trigger the same thing down the line. Circular references are easily avoided, and the post-update event model of plugins ensures the minimum number of cells in the employee's pay are recalculated automatically.
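
As a minimal sketch of that cascade (the new_cell entity, its attributes, and the formula evaluation are all hypothetical), registered post-operation on Update:

// Hypothetical cascade plugin, registered post-operation on Update of a
// new_cell entity. Each child update re-enters the pipeline and recalculates
// its own children in turn. All names are illustrative.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class RecalculateChildCells : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider
            .GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Find the cells that declare the changed cell as a parent.
        var query = new QueryExpression("new_cell")
        {
            ColumnSet = new ColumnSet("new_formula")
        };
        query.Criteria.AddCondition(
            "new_parentcellid", ConditionOperator.Equal, context.PrimaryEntityId);

        foreach (var child in service.RetrieveMultiple(query).Entities)
        {
            // Recalculating needs the child's other parent values too - this
            // is the per-cell read cost discussed below.
            var update = new Entity("new_cell") { Id = child.Id };
            update["new_value"] = Recalculate(service, child);
            service.Update(update); // re-fires this plugin for the child's children
        }
    }

    private static decimal Recalculate(IOrganizationService service, Entity cell)
    {
        // Placeholder: evaluate the cell's configured formula against its
        // parent values.
        return 0m;
    }
}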

BUT for each cell there are multiple other parent cell values to retrieve in order to recalculate the target cell, and for a large organization this could snowball into more computation than the CDS environment can handle.

To mitigate this, when a cell changes that would start this sequence, I could instead invalidate either the employee's pay or the pay run for all employees, re-implement the trigger logic within the recalculation code, and fire the plugin at that higher level. But this is now potentially even more load on the CDS database, as well as on the plugin mechanism.

I could also reduce the fetches of parent cell values by hard-coding the payroll logic; however, defining the dependencies as configuration rather than in code is what allows power users (aka citizen developers!) to define their own logic - per business or even per country.

My current conclusion is that even if the processing code is instead implemented in, say, a multi-threaded or multi-instance worker application hosted on a super-powerful VM in Azure, the configuration model above ultimately results in too much querying, reading, and writing into and out of CDS. At that point, caching the values out of CDS, either into memory or into a faster database the code can access directly, would be required. My gut feel is that this requirement invalidates the whole model, or at least adds back a level of complexity that the model sought to remove in the first place.
