
Reset count of IDs once they've all been seen, count how many days it took, then reset until next time

Hi,

 

I have 1500 unique IDs. Every day many of them are seen, some multiple times a day.

 

I would like to do a cumulative count and know how many days it takes to see all 1500 of them. Once the count reaches that amount, the cumulative sum needs to reset and start another count until the goal is met again.

 

I've done a running total after a grouped count by date, but my issue, aside from the needed reset, is that duplicates need to be removed based on the time period needed to attain the goal. If I'm grouping my IDs by date, then I'm losing the ability to sort out duplicates...

 

So I'm guessing I need to first find how much time is needed, then remove the duplicates from a custom column spanning its own run toward the goal, and then calculate the running total...
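The logic described above (a cumulative distinct count that resets as soon as all IDs have been seen) can be sketched in plain Python. This is a toy with 3 IDs instead of 1500; all names and data are illustrative, not from the actual dataset:

```python
# Sketch of the reset logic: each time the set of remaining IDs empties,
# one completion cycle ends and the tracker resets.
def completion_cycles(events, all_ids):
    """events: list of (day, id) pairs in chronological order.
    Returns the days on which each full cycle completes."""
    remaining = set(all_ids)
    completions = []
    for day, id_ in events:
        remaining.discard(id_)        # duplicates are simply ignored
        if not remaining:             # all IDs seen -> cycle complete
            completions.append(day)
            remaining = set(all_ids)  # reset for the next cycle
    return completions

events = [(1, "A"), (1, "B"), (2, "A"), (3, "C"),   # cycle 1 ends day 3
          (4, "B"), (4, "B"), (5, "A"), (6, "C")]   # cycle 2 ends day 6
print(completion_cycles(events, {"A", "B", "C"}))   # → [3, 6]
```

Duplicates are handled for free here: discarding an already-removed ID from the remaining set changes nothing, which is exactly the per-cycle deduplication the problem needs.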

 

Here's an ugly Paint mockup of what I'd like my final result to look like (I realize I need my date format to be datetime, otherwise I'll never get a correct goal and it'll be tricky):

[screenshot: mockup of the desired result]

 

 

I've found this solution from Lind25, but the syntax seems wrong, maybe from a previous Power Query version:

https://community.powerbi.com/t5/Desktop/Running-total-that-resets-when-equal-to-or-greater-than-a-s...

 

 

let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    AddedIndex = Table.AddIndexColumn(Source, "Index", 1, 1),
    #"Added Custom" = Table.AddColumn(
        AddedIndex,
        "Running total reset",
        each List.Accumulate(
            List.FirstN(AddedIndex[Difference], [Index]),
            0,
            (state, current) => if state + current > 60 then 0 else state + current
        )
    )
in
    #"Added Custom"
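For comparison, here is my reading of that M snippet as plain Python: a running total that resets to 0 whenever adding the next value would push it past the threshold (60 in the linked post). The sample values are made up:

```python
# Running total over a list of values, resetting to 0 once the sum
# would exceed the threshold (mirrors the List.Accumulate logic above).
def running_total_reset(values, threshold=60):
    out, state = [], 0
    for v in values:
        state = 0 if state + v > threshold else state + v
        out.append(state)
    return out

print(running_total_reset([20, 30, 20, 50, 15]))  # → [20, 50, 0, 50, 0]
```

Note the M version recomputes the whole prefix with List.FirstN on every row, which is quadratic; carrying the state forward in a loop, as above, is linear.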

 

 

 

Any takers? I'm completely lost on this one...

 

Thanks

 

4 ACCEPTED SOLUTIONS


@Kiwizqt 

I've simulated a scenario with only 10 IDs using some of the data you provided; see if I'm on the right track. If I am, I'll work out the code to implement the end result.

Sample PQ 


@Kiwizqt 

I did a test performing the core operation in Python, and it seems to finish in less than 20 seconds for the data you posted. See it in the attached file; you'll have to update the path to the Excel file you shared. Here is the M code for the main query:

Note it should be cleaned up a bit, as it does some things that might not be necessary.

 

 

let
    Source = Excel.Workbook(File.Contents("d:\Downloads\Sample PQ Help.xlsx"), null, true),
    Maintenance_Itinérante___CSV_Table = Source{[Item="Maintenance_Itinérante___CSV",Kind="Table"]}[Data],
    #"Changed Type" = Table.TransformColumnTypes(Maintenance_Itinérante___CSV_Table,{{"Date&Time Seen", type datetime}, {"ID", Int64.Type}}),
    #"Removed Duplicates" = Table.Distinct(#"Changed Type", {"Date&Time Seen", "ID"}),
    #"Sorted Rows" = Table.Sort(#"Removed Duplicates",{{"Date&Time Seen", Order.Ascending}}),
    #"Added Custom" = Table.AddColumn(#"Sorted Rows", "Date", each Date.From([#"Date&Time Seen"])),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Date&Time Seen"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Removed Columns",{"Date", "ID"}),
    #"Removed Duplicates1" = Table.Distinct(#"Reordered Columns", {"Date", "ID"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Removed Duplicates1",{{"Date", Int64.Type}}),
    #"Run Python script" = Python.Execute("# 'dataset' holds the input data for this script#(lf)groupeddataset = dataset.groupby(['Date'])['ID'].apply(lambda x: list(x)).to_frame().reset_index()#(lf)#test.groupby(['Pos'])['Pos2'].apply(lambda x: list(x)).to_frame().reset_index()#(lf)a = list(groupeddataset['ID']) #(lf)acc=list(initial['ID']); res=[]#(lf)for i in range(len(a)):#(lf)    acc=set(acc)-set(a[i])#(lf)    #acc=set(acc)-set([a[i]])#(lf)    if acc == set(): #(lf)        acc=initial#(lf)        res=res+[i]#(lf)#(lf)output=pandas.DataFrame(res,columns=['Positions'])",[dataset=#"Changed Type1", initial=Table.SelectRows(All_IDsT,each [ID]<> 15133)]),
    groupeddataset = #"Run Python script"{[Name="groupeddataset"]}[Value],
    groupeddataset2 = Table.TransformColumnTypes(groupeddataset,{{"Date", Int64.Type}}),
    #"Changed Type3" = Table.TransformColumnTypes(groupeddataset2,{{"Date", type date}}),
    #"Sorted Rows1" = Table.Sort(#"Changed Type3",{{"Date", Order.Ascending}}),
    CompletionPositionsT = #"Run Python script"{[Name="output"]}[Value],
    CompletionPositionsT2 = Table.TransformColumnTypes(CompletionPositionsT,{{"Positions", Int64.Type}}),
    result = List.Select(groupeddataset2[Date], each List.Contains(CompletionPositionsT2[Positions],_ - List.Min(groupeddataset2[Date]))),
    #"Converted to Table" = Table.FromList(result, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Changed Type2" = Table.TransformColumnTypes(#"Converted to Table",{{"Column1", type date}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type2",{{"Column1", "Completion dates"}})
in
    #"Renamed Columns"

 

 

The main step is #"Run Python script", with the following Python code:

 

 

import pandas  # Power BI usually provides this; the explicit import is safe

# group the IDs seen on each day into one list per day
groupeddataset = dataset.groupby(['Date'])['ID'].apply(lambda x: list(x)).to_frame().reset_index()
a = list(groupeddataset['ID'])
full = set(initial['ID'])   # the complete set of IDs to be seen
acc = full
res = []                    # positions where each completion cycle ends
for i in range(len(a)):
    acc = acc - set(a[i])
    if acc == set():
        acc = full          # reset to the full ID set for the next cycle
        res = res + [i]

output = pandas.DataFrame(res, columns=['Positions'])

 

 

It groups the IDs by date (day level) and then operates on that to extract a list of the positions where each section with all IDs seen ends. Note I have filtered out ID 15133 from the list of IDs so that there is at least one section that contains all IDs.
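A dependency-free toy version of that pass (3 IDs instead of 1500, with itertools.groupby standing in for the pandas groupby; the data is made up) shows the mechanics:

```python
from itertools import groupby

# (Date, ID) rows sorted by Date; illustrative data only
rows = [(1, "A"), (1, "B"), (2, "A"), (2, "C"),
        (3, "B"), (3, "A"), (4, "C")]
full = {"A", "B", "C"}

# one list of IDs per day, in date order
per_day = [[id_ for _, id_ in grp]
           for _, grp in groupby(rows, key=lambda r: r[0])]

acc, positions = set(full), []
for pos, ids_today in enumerate(per_day):
    acc -= set(ids_today)      # drop whatever was seen that day
    if not acc:                # all IDs seen -> section ends here
        positions.append(pos)
        acc = set(full)        # reset for the next section
print(positions)  # → [1, 3]
```

The positions are day indices within the grouped table, which is why the M query afterwards maps them back to real dates via the minimum date.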

Please mark the question solved when done and consider giving kudos if posts are helpful.

Contact me privately for support with any larger-scale BI needs, tutoring, etc.

Cheers 

As much as I love M, I will put in another plug for the DAX approach on this one. I don't know if your IDs have category columns associated with them, or whether you will also want slicers (which would require a DAX approach). Attached is a pbix with your example data. It uses the DAX expression below to generate the table shown (I also added a small table with the "Cycle" values 1, 2, 3). You could compute the outputs separately, but it is calculation-intensive, so since I already had the virtual table in the measure, I generated both outputs and concatenated them together.

 

[screenshot: result table showing time and span per completion cycle]

Time and Span for Completion =
VAR thiscycle =
    SELECTEDVALUE ( Cycle[Completion Cycle] )
VAR ids =
    ALLSELECTED ( Seen[ID] )
VAR idcount =
    COUNTROWS ( ids )
VAR summarylastcycle =
    ADDCOLUMNS (
        VALUES ( Seen[Date&Time Seen] ),
        "IDsSoFar",
            VAR thistime = Seen[Date&Time Seen]
            RETURN
                COUNTROWS (
                    FILTER (
                        ids,
                        CALCULATE (
                            COUNT ( Seen[ID] ),
                            Seen[Date&Time Seen] <= thistime
                        ) >= thiscycle - 1
                    )
                )
    )
VAR completiontimelastcycle =
    IF (
        thiscycle = 1,
        MIN ( Seen[Date&Time Seen] ),
        MINX (
            FILTER (
                summarylastcycle,
                [IDsSoFar] >= idcount
            ),
            Seen[Date&Time Seen]
        )
    )
VAR summarythiscycle =
    ADDCOLUMNS (
        FILTER (
            VALUES ( Seen[Date&Time Seen] ),
            Seen[Date&Time Seen] >= completiontimelastcycle
        ),
        "IDsSoFar",
            VAR thistime = Seen[Date&Time Seen]
            RETURN
                COUNTROWS (
                    FILTER (
                        ids,
                        CALCULATE (
                            COUNT ( Seen[ID] ),
                            Seen[Date&Time Seen] <= thistime
                        ) >= thiscycle
                    )
                )
    )
VAR completiontimethiscycle =
    MINX (
        FILTER (
            summarythiscycle,
            [IDsSoFar] >= idcount
        ),
        Seen[Date&Time Seen]
    )
VAR span =
    DATEDIFF (
        completiontimelastcycle,
        completiontimethiscycle,
        DAY
    )
VAR range = completiontimelastcycle & " - " & completiontimethiscycle
RETURN
    span & " days" & "; " & range
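For readers more at home outside DAX, here is a plain-Python sketch of what the measure above computes per cycle: each cycle's completion timestamp plus the span in days since the previous completion. The 2-ID dataset and dates are made up:

```python
from datetime import datetime

def cycle_summaries(events, all_ids):
    """events: (datetime, id) pairs sorted ascending by time.
    Returns (span_in_days, cycle_start, cycle_end) per completed cycle."""
    remaining, summaries = set(all_ids), []
    cycle_start = events[0][0]          # cycle 1 starts at MIN(datetime)
    for ts, id_ in events:
        remaining.discard(id_)
        if not remaining:               # all IDs seen -> cycle complete
            summaries.append(((ts - cycle_start).days, cycle_start, ts))
            cycle_start, remaining = ts, set(all_ids)
    return summaries

events = [(datetime(2020, 9, 1), "A"), (datetime(2020, 9, 3), "B"),
          (datetime(2020, 9, 4), "A"), (datetime(2020, 9, 8), "B")]
for span, start, end in cycle_summaries(events, {"A", "B"}):
    print(f"{span} days; {start:%Y-%m-%d} - {end:%Y-%m-%d}")
```

As in the measure, each new cycle starts at the previous cycle's completion timestamp, and the first cycle starts at the earliest timestamp in the data.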

 

Regards,

Pat


Here is the code that implements the algorithm I described in the previous message:

let
    completion = (tab) =>
        let
            grpID = Table.Group(tab, {"ID"}, {"grp", each _}),
            nids = Table.RowCount(grpID),
            currLastDate = List.Min(List.Last(grpID[grp])[Date]),
            rest = Table.SelectRows(tab, each _[Date] > currLastDate),
            result = if Table.RowCount(Table.Distinct(rest, "ID")) < nids
                then {currLastDate}
                else {currLastDate} & @completion(rest)
        in
            result
in
    completion

This function receives as input a table with Date and ID columns and returns a list of dates where each saturation cycle of all the distinct IDs in the table is completed.
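My reading of that recursive function in Python, for readers who don't speak M (toy data; the completion date of a cycle is the date on which the last not-yet-seen ID first appears):

```python
# Recursive sketch mirroring the M function: find the date the last new
# ID shows up, then recurse on the rows strictly after that date as long
# as they still contain every distinct ID.
def completion(rows):
    """rows: list of (date, id) tuples sorted ascending by date."""
    first_seen = {}                       # id -> first date it appears
    for date, id_ in rows:
        first_seen.setdefault(id_, date)
    n_ids = len(first_seen)
    cycle_end = max(first_seen.values())  # date the last new ID appears
    rest = [(d, i) for d, i in rows if d > cycle_end]
    if len({i for _, i in rest}) < n_ids:     # remaining rows lack an ID
        return [cycle_end]
    return [cycle_end] + completion(rest)

rows = [(1, "A"), (2, "B"), (3, "C"),        # cycle 1 completes on day 3
        (4, "A"), (4, "C"), (5, "B"),        # cycle 2 completes on day 5
        (6, "A")]                            # incomplete tail, ignored
print(completion(rows))  # → [3, 5]
```

Like the M version, each recursive call recomputes the set of distinct IDs from the remaining rows, so it stops as soon as the tail can no longer saturate the full set.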

 

 

It seems very fast compared to the previous solution based on List.Generate, List.Difference, and so on...

 

 

To obtain a more pleasant output, I grafted a copy of the vector of IDs into the original table at three different random points of the [ID] column 😊

 

 

[screenshot: resulting list of completion dates]


48 REPLIES

Hi @Kiwizqt 

I don't quite understand. Can you show:

1. the original data you are working on (an excerpt that shows all the relevant fields)?

2. the result you expect from the data in 1, and how you get there?

 


Hi @AlB, thank you for your time, and sorry for not explaining it better.

 

My answer got removed on suspicion of spam 😞 here's another go at it.

 

Here's what I'm working with:

[screenshot: source data]

And here's what I'd like to know about it:

[screenshot: desired result with Completion Cycles, Days to Achieve, and Time Span columns]

Where Completion Cycle refers to the moment I choose to start counting IDs toward the 1500 that need to be seen,

Where Days to Achieve refers to the time it took to reach those 1500 unique IDs,

And ideally, a Time Span column telling me the time period each completion cycle took place in.

 

The trick is that I'm registering duplicate IDs on that road to 1500 uniques, and that the next completion cycle needs to start as soon as the previous one is done, working toward a new goal of 1500 unique IDs.

 

I do have another table where my 1500 total IDs are stored, if a join is needed.

 

Note: I don't have much knowledge of DAX/Power BI, but if you think it's easier, or you don't think it's doable in PQ, I could also upload my source data and make it work there.

 

 

This is easier done in DAX, in my opinion. Here is a measure expression you can try in a card visual to give the first (minimum) datetime at which all the IDs have been seen. To see the 2nd and 3rd time, just change the 1 to 2 or 3. I made some sample data and called that table Seen, so change that throughout to your actual table name. Note there is a lot of calculation going on here, so it may not be performant at large scale. If you try it, please let me know how it performs.

 

First All IDs Seen =
VAR ids =
    ALLSELECTED ( Seen[ID] )
VAR idcount =
    COUNTROWS ( ids )
VAR summary =
    ADDCOLUMNS (
        VALUES ( Seen[Date & Time Seen] ),
        "IDsSoFar",
            VAR thistime = Seen[Date & Time Seen]
            RETURN
                COUNTROWS (
                    FILTER (
                        ids,
                        CALCULATE (
                            COUNT ( Seen[ID] ),
                            Seen[Date & Time Seen] <= thistime
                        ) >= 1
                    )
                )
    )
RETURN
    MINX (
        FILTER (
            summary,
            [IDsSoFar] >= idcount
        ),
        Seen[Date & Time Seen]
    )

 

Regards,

Pat

 

 

 

= let
    // {1..6} is the set of IDs
    Source = List.Accumulate(
        Tabl[Index],
        [dcompl = {[when = Tabl[Date]{0}, atPos = Tabl[Index]{0}]}, ck = {1..6}],
        (s, c) => if List.IsEmpty(s[ck])
            then [dcompl = s[dcompl] & {[when = Tabl[Date]{c}, atPos = c]}, ck = {1..6}]
            else [dcompl = s[dcompl], ck = List.Difference(s[ck], {Tabl[id]{c}})]
    )
in
    Table.FromRecords(Source[dcompl])

 

The main difficulty seems to be the presence of duplicate IDs before the set fills up. This can be overcome by using the concept of union of sets, with the List.Union function. It can be done in many ways. If you post some tables that can be copied easily, I can try it; if not, try to adapt this script or the equivalent made with List.Generate.
 

Basically the same logic, but this implementation is much more efficient, as it avoids continuous list sorting and the comparisons are simpler (List.IsEmpty, I think, is much faster than list1 = list2).

 

 

 

let
    Source = List.Accumulate(
        Tabl[Index],
        [dcompl = {[when = Tabl[Date]{0}, atPos = Tabl[Index]{0}]}, ck = {1..6}],
        (s, c) => if List.IsEmpty(s[ck])
            then [dcompl = s[dcompl] & {[when = Tabl[Date]{c}, atPos = c]}, ck = {1..6}]
            else [dcompl = s[dcompl], ck = List.Difference(s[ck], {Tabl[id]{c}})]
    )
in
    Table.FromRecords(Source[dcompl])
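The same state machine can be written as a fold in Python. This is my reading of the List.Accumulate call, with 3 toy IDs instead of {1..6}; the seed record for row 0 is omitted for brevity, and all data is illustrative:

```python
from functools import reduce

# Note the quirk shared with the M version: a completion is recorded on
# the first row AFTER the checklist empties, and the checklist is reset
# without ticking off that row's own ID.
dates = [10, 11, 12, 13, 14, 15]
ids   = [1, 2, 2, 3, 1, 3]
ALL   = {1, 2, 3}

def step(state, pos):
    done, ck = state
    if not ck:                                   # checklist emptied earlier
        return done + [(dates[pos], pos)], set(ALL)
    return done, ck - {ids[pos]}                 # tick off this row's ID

done, ck = reduce(step, range(len(ids)), ([], set(ALL)))
print(done)  # → [(14, 4)]
```

The state is a pair (completions so far, remaining checklist), exactly like the [dcompl, ck] record the M code threads through the accumulate.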

@mahoneypat thanks for the try, but the card only returns the date of completion (correctly, I might add, I checked it); it doesn't count the number of days needed.

 

Ideally, I would have liked to plot it on a line graph with the number of IDs on (y) and raw days on (x), so that I could superimpose different periods.

 

Hi @Rocco_sprmnt21! Thank you for your time. I'll be honest, I can't decipher much of your code, but I'm eager to learn from it. I've hosted an Excel file on Google Drive below, is that enough/accessible for you?

 

https://drive.google.com/file/d/1gWHYl3FxX3t2Dt6jQ8MCBWV1WF4d80V0/view?usp=sharing

 

 

I thought the finish time was one of your requests, but you can easily adapt the expression I suggested to get the time it took, as follows. You can choose another time dimension instead of MINUTE (DAY, HOUR, etc.). Also, for cycles 2 and higher you would need to set the starttime variable to the same expression, but with the cycle minus 1.

 

Time for First Seen =
VAR ids =
    ALLSELECTED ( Seen[ID] )
VAR idcount =
    COUNTROWS ( ids )
VAR summary =
    ADDCOLUMNS (
        VALUES ( Seen[Date & Time Seen] ),
        "IDsSoFar",
            VAR thistime = Seen[Date & Time Seen]
            RETURN
                COUNTROWS (
                    FILTER (
                        ids,
                        CALCULATE (
                            COUNT ( Seen[ID] ),
                            Seen[Date & Time Seen] <= thistime
                        ) >= 1
                    )
                )
    )
VAR completiontime =
    MINX (
        FILTER (
            summary,
            [IDsSoFar] >= idcount
        ),
        Seen[Date & Time Seen]
    )
VAR startime =
    MIN ( Seen[Date & Time Seen] )
RETURN
    DATEDIFF (
        startime,
        completiontime,
        MINUTE
    )

 

Regards,

Pat

 

Hi @Kiwizqt 

Given the size of your data, the List.Accumulate function is not a good fit.

You could try List.Generate (but try it incrementally (*)) or try the path indicated by @mahoneypat.

 

Try List.Generate starting from small lists and gradually increase the size to evaluate the timings.

I propose a draft which calculates only the essentials. It returns a vector of all nulls except at the positions where the breaks occur. You can use this list (filtered for non-null values) to find the affected rows.

let
    // idsV is a partial list of all ids
    // Seen is a partial list (20k items, f.i.) of the overall ids in the Seen table
    ids = List.Buffer(idsV),
    s = List.Buffer(Seen),
    nID = List.Count(s),
    Source = List.Generate(
        () => [atPos = null, ck = ids, idx = 0],
        each [idx] < nID,
        each if List.IsEmpty([ck])
            then [atPos = [idx], ck = ids, idx = [idx] + 1]
            else [atPos = null, ck = List.Difference([ck], {s{[idx]}}), idx = [idx] + 1],
        each [atPos]
    )
in
    Source
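A generator version of the same idea, mirroring the List.Generate draft (toy data; it inherits the draft's quirk that the break is reported on the row after the checklist empties, without ticking off that row's own ID):

```python
# One output per input row: None everywhere except at the positions
# where a break (full saturation) occurs.
def break_positions(seen, all_ids):
    ck = set(all_ids)
    for idx, id_ in enumerate(seen):
        if not ck:                 # checklist emptied on an earlier row
            yield idx
            ck = set(all_ids)      # reset without consuming this row
        else:
            yield None
            ck.discard(id_)

out = list(break_positions(["A", "B", "C", "A", "B", "C", "A"],
                           {"A", "B", "C"}))
print([p for p in out if p is not None])  # → [3]
```

Being lazy, the generator keeps only the checklist in memory, which is the same reason the List.Generate draft scales better than the List.Accumulate one.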

(*)

At worst, you have 1500 IDs and 100k rows; then for each of the 100k rows in the table it may be necessary to "traverse" the whole vector of 1500 distinct IDs, roughly 150M checks in the worst case.

 

If a list of 20k rows and 100 distinct IDs takes 1 minute, then for your table it would take 100k/20k * 1500/100 = about 75 minutes (an hour, as an order of magnitude, of course).
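That back-of-the-envelope estimate, written out under the stated assumption that runtime scales linearly in rows × distinct IDs:

```python
# Proportional scaling from the measured 20k-row / 100-ID run (1 minute)
# to the full 100k-row / 1500-ID table.
baseline_minutes = 1
estimate = baseline_minutes * (100_000 / 20_000) * (1500 / 100)
print(estimate)  # → 75.0
```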

 

 

Hi, @Kiwizqt 

In the two worksheets you provided, the number of distinct IDs in the Seen worksheet is 1698, but the number of IDs in the Total ID worksheet is 1374.

Could you provide a corrected file?

Hi @ziying35, sorry for the inconvenience. I cleared the wrong data by applying an inner merge. The excess of 324 IDs came from wrong inputs from users, and I thought it didn't matter for the solution, as I would have done that merge before applying any solution given here, or I could manually edit the M code.

 

Below is the final data:

https://drive.google.com/file/d/1gWHYl3FxX3t2Dt6jQ8MCBWV1WF4d80V0/view?usp=sharing
