
Power Query doesn't use 100% of the processor

Hello,

 

In this example I'm using Power Query in Excel, but I guess it would be the same in Power BI.

 

I currently have a very complex query that takes an average of 30 minutes to execute, while my processor (Intel i3 10th gen) is at 20% and my memory at 600 MB (8 GB in total, 2 GB still available).

 

Is there a way to tell Power Query to use all the capacity it needs? Currently it runs really slowly, while I have a lot of unused capacity on my computer.

 

I don't have any recursive tasks in my query; all rows could theoretically be processed at the same time (that's why I don't think it's a logical bottleneck).

 

Here is a screenshot of my Task Manager (columns are: process name, processor, memory, hard disk, network):

 

Thank you for your help.

Alexandre

 

13 REPLIES

Hi @_AlexandreRM_ ,

 

Really hard to give a definitive answer here, but a few things to think about:

 

- Some of it depends on whether the work you're asking Excel/Power Query to do is serial or parallelizable, i.e. whether the work has to be done one part after another, or whether it can be threaded across different processing cores. Please don't ask me how to find out if this is the case, I haven't got a clue. It would probably be a question for someone like Ehren.

 

- If you're using a 32-bit version of Excel, then there's a hard RAM cap of 2GB per Windows process instance which, I believe, would then be subdivided across your evaluation containers that handle your query transformations. In this instance, you could consider upgrading to Excel 64-bit to avoid the cap.

 

- If you would consider using Power Query within Power BI Desktop, then you could get both the RAM-cap gains of the 64-bit version and super-charged RAM usage on the evaluation containers by updating the MaxEvaluationWorkingSetInMB registry entry. More details on that from Chris Webb here:

https://blog.crossjoin.co.uk/2021/06/06/speed-up-power-query-in-power-bi-desktop-by-allocating-more-... 

 

Pete

Hi @BA_Pete , thank you very much for your tips.

 

- In my opinion all calculations could be parallel. @Ehren , I'm mentioning you in case you have a clue; here is my query code (function f is a custom function that takes a very long time to complete):

let
    base = Table.Buffer(#"RRP1 + RRP4"),
    baseMaterials = Table.Distinct(base, {"Numéro de produit", "Désignation du produit"}),
    baseMaterialsJoinBase = Table.NestedJoin(baseMaterials, {"Numéro de produit"}, base, "Numéro de produit", "join", JoinKind.LeftOuter),

    bufferJoinTable = Table.TransformColumns(baseMaterialsJoinBase, {"join", Table.Buffer}),

    addMin = Table.AddColumn(bufferJoinTable, "Qty", each f([join], 0, 9999999999, 0))

in
    addMin

 

- I just checked in my account data, and I am already on a 64-bit version,

 

- This is incredibly powerful, thank you for sharing! I defined the parameter in my Power BI Desktop, but I didn't find it in Excel Power Query; it seems Microsoft hasn't pushed this change to Excel yet, sadly.

 

Alexandre

What is the CPU/memory utilization of the Microsoft.Mashup.Container*.exe processes? They're the ones doing the bulk of the PQ work.

 

Also, it's possible the processing is slow because of data source access, which wouldn't show up in CPU/memory.

 

If you remove the addMin step, is everything faster? If so, it would be important to share what the f function does.
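
For example (a quick test sketch, reusing the step names from the query you posted above), you could temporarily end the query just before the custom column is added:

    let
        base = Table.Buffer(#"RRP1 + RRP4"),
        baseMaterials = Table.Distinct(base, {"Numéro de produit", "Désignation du produit"}),
        baseMaterialsJoinBase = Table.NestedJoin(baseMaterials, {"Numéro de produit"}, base, "Numéro de produit", "join", JoinKind.LeftOuter),
        bufferJoinTable = Table.TransformColumns(baseMaterialsJoinBase, {"join", Table.Buffer})
    in
        bufferJoinTable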

Hello Ehren,

 

The CPU/memory utilization of all Excel sub-processes is as follows:

(columns: process name, CPU, memory, hard disk, network)

Note that the CPU utilization is far higher than on my previous screenshot, even though I didn't change anything. Still, PQ has good room for improvement.

 

The data sources are 2 Excel files on SharePoint, previously merged in the #"RRP1 + RRP4" table.

 

Here is the code of function f, which is the reason for the query's slowness (a recursive function applied to the [join] table). Its goal is to calculate the minimum forecasted stock level, using all stock entries and exits:

 

    f = (codeTable as table, previousQt as number, minQt as number, i as number) => 
    let
        currentQt = previousQt + codeTable{i}[#"Qté entrées/besoins"],
        currentMinQt = if currentQt < minQt then currentQt else minQt,

        //former method, the new one with error handling seems more efficient:
        //result = if i = iMax then minQt else @f(codeTable, currentQt, currentMinQt, i + 1, iMax)
        result = try @f(codeTable, currentQt, currentMinQt, i + 1) otherwise currentMinQt
    in
        result,

 

If you have an idea to improve this function, I would be strongly interested!

 

Alexandre

I'm not entirely sure what the purpose of the minQt calculation is, but other than that the recursion seems unnecessary. Buffering each and every table in the join column independently also seems like it could be detrimental perf-wise.

 

After performing your join, have you tried just summing the "Stock movement qty" column in the nested tables?
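
For example, a minimal sketch of that idea, reusing the step names from your query above (I'm assuming "Stock movement qty" corresponds to your "Qté entrées/besoins" column, and the "Total qty" column name is just an example). Note that it works directly on baseMaterialsJoinBase, so the per-table buffering step isn't needed:

    addSum = Table.AddColumn(
        baseMaterialsJoinBase,
        "Total qty",
        each List.Sum([join][#"Qté entrées/besoins"]),
        type number)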

 

 

Hello @Ehren , I can't use the aggregate option.

 

Here is an example of how the function works. The final result of the function is the Min qty on the last row (20):

Date         Stock movement qty   Current qty   Min qty
01/01/2023   +30                  30            30
01/02/2023   +10                  40            30
01/03/2023   -15                  25            25
01/04/2023   -5                   20            20
01/05/2023   +40                  60            20

 

There are a few ways to do this without recursion, but some of them involve repeatedly summing all the previous rows, which can be inefficient.
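
For illustration, a sketch of one such approach (assuming a Source table with the columns from your example): every row re-sums all rows from the start up to its own index, so the work grows quadratically with the row count.

    let
        Source = ...,
        #"Sorted Rows" = Table.Sort(Source, {{"Date", Order.Ascending}}),
        #"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 0, 1, Int64.Type),
        #"Current qty" = Table.AddColumn(#"Added Index", "Current qty",
            each List.Sum(List.FirstN(#"Added Index"[Stock movement qty], [Index] + 1)), type number),
        #"Min qty" = Table.AddColumn(#"Current qty", "Min qty",
            each List.Min(List.FirstN(#"Current qty"[Current qty], [Index] + 1)), type number)
    in
        #"Min qty"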

 

Here's an approach that only scans the data once.

 

let
    Source = ...,
    #"Sorted Rows" = Table.Sort(Source,{{"Date", Order.Ascending}}),
    #"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 0, 1, Int64.Type),
    RunningTotal = List.Generate(
        () => [Index = 0, Total = #"Added Index"{0}[Stock movement qty], Min = Total],
        each [Index] < Table.RowCount(#"Added Index"),
        (previous) =>
            let
                newIndex = previous[Index] + 1
            in
                [
                    Index = newIndex,
                    Total = previous[Total] + #"Added Index"{newIndex}[Stock movement qty],
                    Min = List.Min({previous[Min], Total})
                ]),
    Custom1 = Table.AddColumn(#"Added Index", "Current qty", each RunningTotal{[Index]}[Total], type number),
    Custom2 = Table.AddColumn(Custom1, "Min qty", each RunningTotal{[Index]}[Min], type number)
in
    Custom2

 

That said, these kinds of cumulative calculations are probably better done in DAX, after loading the data to the (Power Pivot) Data Model.

Hello @Ehren , sorry for the very late reply, I was on vacation.

I think List.Generate is a good alternative; I had never tried using it with records to store multiple values at each iteration!

 

After testing, it seems that both solutions (recursion and list generation) have almost the same (low) performance. I think it's simply the manual iteration over each row of a table that is time-consuming. For those who are interested, here is the formula using List.Generate:

    f = (codeTable as table) => 
        let
            iMax = Table.RowCount(codeTable),
            qtList = List.Buffer(codeTable[#"Qté entrées/besoins"]),

            cumulativeStockList = List.Generate(
                () => 
                    [
                        i = 1, 
                        qt = qtList{0}, 
                        minQt = qtList{0}
                    ], 
                each [i] <= iMax, 
                each 
                    [
                        i = [i] + 1,
                        qt = [qt] + qtList{[i]},
                        minQt = List.Min({[qt] + qtList{[i]}, [minQt]})
                    ])
        in
            List.Last(cumulativeStockList)[minQt],
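
For reference, the call site then only needs the nested table (a sketch assuming the same step names as my earlier query):

    addMin = Table.AddColumn(bufferJoinTable, "Qty", each f([join]), type number)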

 

So, unless Microsoft adds a way to access row contents more efficiently for custom formulas, I don't think this problem will be solved!

 

I will also take a look at DAX, but its interface in Excel isn't... user-friendly, let's say.

Just curious: do you see a perf difference if you omit the call to List.Buffer?
