Hello,
I would like to create a line chart that reflects how many items are at a certain location in a particular month.
The x-axis will be month, and the y-axis is the number of items.
The table below is an example of data I have for a particular item.
Item | Date | From | To |
XXXX1 | Wednesday, 20 February 2019 | Home | Part Time Job |
XXXX1 | Friday, 12 April 2019 | Part Time Job | Jobless |
XXXX1 | Wednesday, 15 May 2019 | Jobless | University |
XXXX1 | Friday, 20 December 2019 | University | Part Time Job |
XXXX1 | Thursday, 8 October 2020 | Part Time Job | High School |
XXXX1 | Thursday, 7 September 2023 | High School | Part Time Job |
XXXX1 | Monday, 11 September 2023 | Part Time Job | College Prep |
XXXX1 | Friday, 27 October 2023 | College Prep | University |
XXXX1 | Thursday, 22 February 2024 | University | Work |
You can see the item was at University from 27 October 2023 to 22 February 2024.
At the moment, the line graph only shows how many items entered a location in a month, so item XXXX1 will only appear to be in University in the month of October 2023, because that is when it moved to University.
However, I want it to appear in University also for the months of Nov 2023, Dec 2023, Jan 2024 and Feb 2024, because it was in University during those months. In the line graphs below, I have filtered "To" to "University". 10 items entered "University" on 27 October 2023, and another item entered "University" on 12 Dec 2023. However, there should be 11 items altogether in December, because the 10 items that entered University in October are still there (no items entered in November).
Is this possible?
Hi PBOBOP
You need to transform your data into a proper fact table with Power Query / Transform Data, so that you have one row for each item per day, detailing the current location of the item.
Try using the M-Code below:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZJRa8IwEMe/SshzHkxU3B7HpDhBJtjhoPQhrYcNS3vlGgf99ouWbLVrDSG54/K7+9+RJOFchJ2KhL85KL291g78FRFevRhvsU+/pHePcKqgOelWMDVjEWR00dR6Wz776AbLK7rX5FhsSmBbzO7wiMyNlYq91GRsAO8Jwf1poWmmSssl2+nfquGx4B+V+QZqjGtHq3rFa8ihzIAC2yMe6Y6LC3Wln9h77rDLoGYj0jfmXLBDXiDaiRQrdoDaBRlqPoAe6dhh1Q1A/s8xFPKK1sIZ2J6gHh/Hqt/LfIhMjvOvE6X6X0AthgM9In3xNP0B", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}}),
#"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
#"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"", type text}, {"_1", type text}, {"_2", type text}, {"_3", type text}}),
#"Promoted Headers1" = Table.PromoteHeaders(#"Changed Type1", [PromoteAllScalars=true]),
#"Changed Type2" = Table.TransformColumnTypes(#"Promoted Headers1",{{"Item", type text}, {"Date", type date}, {"From", type text}, {"To", type text}}),
#"Renamed Columns" = Table.RenameColumns(#"Changed Type2",{{"Date", "Start Date of Current Location"}, {"From", "Previous Location"}, {"To", "Current Location"}}),
#"Added Index" = Table.AddIndexColumn(#"Renamed Columns", "Index", 1, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(#"Added Index", "End Date of Current Location", each try #"Added Index"[Start Date of Current Location]{[Index]} otherwise null, Date.Type),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom", each try List.Dates([Start Date of Current Location], Duration.Days([End Date of Current Location] - [Start Date of Current Location]) + 1, #duration(1,0,0,0)) otherwise null),
#"Expanded Custom" = Table.ExpandListColumn(#"Added Custom1", "Custom"),
#"Changed Type3" = Table.TransformColumnTypes(#"Expanded Custom",{{"Custom", type date}}),
#"Renamed Columns1" = Table.RenameColumns(#"Changed Type3",{{"Custom", "Date"}})
in
#"Renamed Columns1"
First, I create a new column with an Index starting from 1.
Then I add a new column where I get the Date value of the next row, in order to get the Start and End Date of the "To" location on one row.
Then I create a List of Dates between the Start Date and End Date in a new column, and expand that List to Rows to get one row for each day.
You can use this table to then count the number of items in each location on each day.
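The expand-to-daily-rows idea above can be sketched outside Power Query as well. This is a minimal Python illustration, assuming a hypothetical `moves` list that mirrors two rows of the sample table (it is not part of the M query above; the last known move is simplified to a single day here, whereas in reality it is open-ended):

```python
from datetime import date, timedelta

# Each move records when an item arrived at a location (the "To" column).
moves = [
    ("XXXX1", date(2023, 10, 27), "College Prep", "University"),
    ("XXXX1", date(2024, 2, 22), "University", "Work"),
]

# Pair each row with the next row's date to get a stay interval,
# then emit one (item, location, day) row per day of the stay.
daily_rows = []
for (item, start, _frm, loc), nxt in zip(moves, moves[1:] + [None]):
    end = nxt[1] if nxt else start  # last known move: no end date yet
    d = start
    while d <= end:
        daily_rows.append((item, loc, d))
        d += timedelta(days=1)
```

Counting distinct items per location and month over `daily_rows` then gives the "still there in November and December" behaviour the question asks for.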
_____________________________________________________
I hope my comment was helpful.
If your question was answered, please mark your post as 'Solved' and consider giving me a 'Thumbs Up'.
Find me on LinkedIn, Sessionize, or my blog Downhill Data
Hi @PBOBOP ,
While @Jonvoge's solution will make getting daily counts very easy, it will also make your data VERY big VERY quickly.
I would recommend making some small transformations to get the 'To' date on each row like this:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZFfC4IwFMW/ysVnH9wqrMcoIgJJyCiIHqZecjCdXFfgt28mlvZvT4Od37nn3J1OztEe5rjOAdMCq1TULnAPVhjTVVBt72xmX9c6R+fsvuQrkg8t4zAvSapOGAoyEMkcYaPjAdEbwCYQiKe3FSqsqq/2NsoSE8xjpE6+L+QNqZKmHhBRdqXWfQrbxOiW4N7fTC/Ihx2WphvER01neclgl2RaqwEU6KJtwT6Z36Oejfx+vIZZaKXwghASlj/Scd7/ET5+W8P5Dg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Item = _t, Date = _t, From = _t]),
chgTypes = Table.TransformColumnTypes(Source,{{"Date", type date}}),
sortItemDate = Table.Sort(chgTypes,{{"Item", Order.Ascending}, {"Date", Order.Ascending}}),
addIndex1 = Table.AddIndexColumn(sortItemDate, "Index1", 1, 1, Int64.Type),
addIndex0 = Table.AddIndexColumn(addIndex1, "Index0", 0, 1, Int64.Type),
mergeItemIndex = Table.NestedJoin(addIndex0, {"Item", "Index1"}, addIndex0, {"Item", "Index0"}, "addIndex0", JoinKind.LeftOuter),
expandDateTo = Table.ExpandTableColumn(mergeItemIndex, "addIndex0", {"Date"}, {"DateTo"}),
remOthCols = Table.SelectColumns(expandDateTo,{"Item", "From", "Date", "DateTo"})
in
remOthCols
Then use a measure to get the count of items on any given day, like this:
_itemsInLocationOverTime =
VAR __cDate = MAX(dimCalendar[Date])
RETURN
CALCULATE(
DISTINCTCOUNT(yourTable[Item]),
FILTER(
yourTable,
__cDate >= yourTable[Date]
&& ( __cDate <= yourTable[DateTo] || ISBLANK(yourTable[DateTo]) )
)
)
This assumes you have an UNRELATED calendar table (dimCalendar), and you must use a dimCalendar field on the visual axis along with this measure.
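In plain terms, the measure counts an item on a given day if that day falls on or after its arrival date and on or before its departure date, with open-ended stays (blank [DateTo]) always counting. A rough Python equivalent of that filter logic, using hypothetical `stays` rows of (item, date_from, date_to):

```python
from datetime import date

# (item, arrival date, departure date or None if still there)
stays = [
    ("XXXX1", date(2023, 10, 27), date(2024, 2, 22)),  # University stay
    ("XXXX2", date(2023, 12, 12), None),               # entered, never left
]

def items_in_location(stays, day):
    """Distinct items present on `day`: arrived on or before it,
    and not yet departed (open-ended stays count)."""
    return len({item for item, d_from, d_to in stays
                if d_from <= day and (d_to is None or day <= d_to)})
```

The set comprehension mirrors DISTINCTCOUNT, and the `d_to is None` branch mirrors the ISBLANK(yourTable[DateTo]) escape hatch.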
Pete
Thank you so much for your suggestions. I have tried both of your methods, but there is a problem, which is my fault because I should have provided more data. Sorry about that. I have added a few more rows to give better context.
There are many items in this data set. So if we use the index method to create an end date for the period that the item was in a certain location, the End Date for the last location of item XXXX1 will be drawn from the Start Date of the first location of item XXXX2 (orange font below), and the End Date for the last location for XXXX2 will be drawn from the Start Date of the first location of item XXXX3 etc.
However, all the items will be sitting in their latest location today. (XXXX1 is still at Work, XXXX2 is at Home, and XXXX3 is still at Work).
Is it possible to have the end date for the last row for each item to be today's date?
Item | Start Date | From | To | Index from 1 | Index from 0 | End Date |
XXXX1 | 20 February 2019 | Home | Part Time Job | 1 | 0 | 12 April 2019 |
XXXX1 | 12 April 2019 | Part Time Job | Jobless | 2 | 1 | 15 May 2019 |
XXXX1 | 15 May 2019 | Jobless | University | 3 | 2 | 20 December 2019 |
XXXX1 | 20 December 2019 | University | Part Time Job | 4 | 3 | 08 October 2020 |
XXXX1 | 8 October 2020 | Part Time Job | High School | 5 | 4 | 07 September 2023 |
XXXX1 | 7 September 2023 | High School | Part Time Job | 6 | 5 | 11 September 2023 |
XXXX1 | 11 September 2023 | Part Time Job | College Prep | 7 | 6 | 27 October 2023 |
XXXX1 | 27 October 2023 | College Prep | University | 8 | 7 | 22 February 2024 |
XXXX1 | 22 February 2024 | University | Work | 9 | 8 | 06 May 2019 |
XXXX2 | 06 May 2019 | Home | Work | 10 | 9 | 20 January 2021 |
XXXX2 | 20 January 2021 | Work | Home | 11 | 10 | 15 March 2019 |
XXXX3 | 15 March 2019 | Home | Part Time Job | 12 | 11 | 22 May 2019 |
XXXX3 | 22 May 2019 | Part Time Job | Work | 13 | 12 | |
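The bleed-over between items described above happens when the date shift runs over the whole table instead of within each item. A sketch of the per-item idea in Python, with today's date as an assumed fallback for each item's last row (the fallback date and helper name are illustrative, not from either answer):

```python
from datetime import date
from itertools import groupby

# (item, start date, from, to), sorted by item then date
rows = [
    ("XXXX1", date(2024, 2, 22), "University", "Work"),
    ("XXXX2", date(2019, 5, 6), "Home", "Work"),
    ("XXXX2", date(2021, 1, 20), "Work", "Home"),
]

def add_end_dates(rows, today):
    """Within each item, End Date = the next row's Start Date;
    the item's last row falls back to `today` instead of
    borrowing the next item's first date."""
    out = []
    for _item, grp in groupby(rows, key=lambda r: r[0]):
        grp = list(grp)
        for cur, nxt in zip(grp, grp[1:] + [None]):
            end = nxt[1] if nxt else today
            out.append(cur + (end,))
    return out
```

With this grouping, XXXX1's single row ends "today" rather than at XXXX2's first Start Date.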
Hi @BA_Pete,
Thank you for your suggestion,
I am stuck at the mergeItemIndex step. When I merge, all the numbers in Index1 are there. However, after the "expandDateTo" step, the last line disappears because there is no matching number in Index0, instead of returning "null" like in your table above, and items 109789 and 109790 disappear completely because they only have one line in Index1.
mergeItemIndex:
expandDateTo:
Do you know how I can get "null" to return for the lines with no match for Index1?
Have you checked that these rows haven't just been re-sorted to the end of the table or something?
There's no reason why a Left Outer merge would remove any rows - that's the point of a Left Outer - but merges in PQ do very often change the sort order of your table, so maybe all the rows with null [DateTo] values have been dumped to the bottom of your table. Try filtering the post-merge query on [DateTo] = null to see if the rows exist.
If these rows do indeed appear to be missing, then you may need to send over some copyable sample data so I can try to reproduce the issue.
Pete
Hi @PBOBOP, regarding this @BA_Pete query. I don't know how many rows you have in your database, but this one should be faster if you have a lot of rows.
let
FnShift = (tbl as table, col as text, shift as nullable number, optional newColName as text) as table =>
// In the 3rd parameter, pass a negative number to shift rows up, a positive one to shift them down; the 4th is optional (a new column name)
let
a = Table.Column(tbl, col),
b = if (shift ?? 0) = 0 then a else if shift > 0 // "??" coalesces a null shift to 0; "shift = null" would itself evaluate to null and break the if
then List.Repeat({null}, shift) & List.RemoveLastN(a, shift)
else List.RemoveFirstN(a, shift * -1) & List.Repeat({null}, shift * -1),
c = Table.FromColumns(Table.ToColumns(tbl) & {b}, Table.ColumnNames(tbl) &
( if newColName <> null then {newColName} else
if shift = 0 then {col & "_Duplicate"} else
if shift > 0 then {col & "_NextValue"}
else {col & "_PrevValue"} ))
in
c,
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZFfC4IwFMW/ysVnH9wqrMcoIgJJyCiIHqZecjCdXFfgt28mlvZvT4Od37nn3J1OztEe5rjOAdMCq1TULnAPVhjTVVBt72xmX9c6R+fsvuQrkg8t4zAvSapOGAoyEMkcYaPjAdEbwCYQiKe3FSqsqq/2NsoSE8xjpE6+L+QNqZKmHhBRdqXWfQrbxOiW4N7fTC/Ihx2WphvER01neclgl2RaqwEU6KJtwT6Z36Oejfx+vIZZaKXwghASlj/Scd7/ET5+W8P5Dg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Item = _t, Date = _t, From = _t]),
ChangedType1 = Table.TransformColumnTypes(Source,{{"Date", type date}}),
GroupedRowsWithAddDateTo = Table.Group(ChangedType1, {"Item"}, {{"All", each FnShift(_, "Date", -1, "Date To") , type table}}),
Combined = Table.Combine(GroupedRowsWithAddDateTo[All]),
ChangedType2 = Table.TransformColumnTypes(Combined,{{"Date To", type date}})
in
ChangedType2
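For readers following the FnShift logic: it pads one end of the column with nulls and trims the other, so each row lines up with a neighbouring row's value. The same padding trick in Python, purely for illustration (this helper is not part of the M query above):

```python
def shift(values, n):
    """Shift a list by n positions, keeping its length: positive n
    shifts values down (pad the top with None), negative n shifts
    them up (pad the bottom), 0 returns a copy."""
    if not n:
        return list(values)
    if n > 0:
        return [None] * n + list(values)[:-n]
    return list(values)[-n:] + [None] * (-n)
```

Calling it with -1, as the grouped step does, puts each row's next value alongside it, which is exactly the "Date To" column.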