# Solutions for Power Query Challenge 6

This morning I logged in to check the solutions for Power Query Challenge 6 that were submitted and... Wow!  There were a bunch, and some cool variety there.  So now it's time to show you all what I came up with here.

# What was Power Query Challenge 6?

The full description and source data can be found in yesterday's post, but in short it was to convert this:

Data table with multiple data points per cell

To this:

Data shown in First Normal Form (1NF)

So how do we do it?

# Two Solutions for Power Query Challenge 6

Wait, two solutions?  Why?

As it turns out, I put together two for this one. My first attempt was cooked up to solve the issue on short notice.  Then I built another that seems a bit more elegant.  So I'll show them both, though briefly.

## Solution 1 - Splitting Individual Columns and Merging Back Together

The first of my solutions for Power Query Challenge 6 is actually quite similar to what Ali posted in the solution thread.  It basically follows this method:

• Step 1: Create a Data query that connects to the source data, and load it as connection only
• Step 2: Create 3 new queries for ItemID, Quantity and Price, each of which:
  • References the Data query
  • Keeps the InvoiceID column and the one other relevant column
  • Splits the relevant column by delimiter, ensuring it splits to rows (not columns, as it defaults to)
  • Adds an Index column
• Step 3: Reference one of the Step 2 tables, and merge the other two tables to it, based on matching the Index column in each

So when complete the query chain looks like this:

And returns the table we're after:

The only real trick to this one is that, when you are building the Price query, Power Query will pick the decimal point as the delimiter for the Price column, so you have to force it to use a line feed instead.  So building the Price query would go through the following steps:

• Right click the Data query --> Reference
• Select the InvoiceID and Price columns --> Right click --> Remove Other Columns
• Right click the Price column --> Split column --> By Delimiter
• Clear the decimal from the Custom area
• Click the arrow to open the Advanced area
• Change the selection to split to Rows
• Check "Split using special characters"
• Choose to insert a Line Feed character
• Click OK
• Set the Data Types
• Go to Add Column --> Add Index Columns

Resulting in this:

The ItemID and Quantity queries follow the same steps, except that Power Query now correctly identifies the Line Feed as the character to split on.
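Sketched in M, the Price query comes out roughly like this (the Data query name matches the staging query above; step names are assumptions, since the UI generates its own):

```
let
    // Reference the staging Data query and keep only the two relevant columns
    Source = Data,
    Kept = Table.SelectColumns(Source, {"InvoiceID", "Price"}),
    // Split the Price column on the line feed character, expanding to rows
    AsLists = Table.TransformColumns(Kept,
        {{"Price", Splitter.SplitTextByDelimiter("#(lf)", QuoteStyle.Csv)}}),
    SplitToRows = Table.ExpandListColumn(AsLists, "Price"),
    Typed = Table.TransformColumnTypes(SplitToRows,
        {{"InvoiceID", Int64.Type}, {"Price", type number}}),
    // The Index column is the merge key used to stitch the three queries back together
    AddedIndex = Table.AddIndexColumn(Typed, "Index", 0, 1)
in
    AddedIndex
```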

## Solution 2 - Group and Split

While the first solution to Power Query Challenge 6 worked, it left me less than satisfied as it took a bunch of queries.  While effective, it didn't feel elegant.  So I cooked up another solution that uses Grouping.  It's actually quite similar to the first solution that Bill Szysz posted.

The basic method is as follows:

• Connect to the data
• Right click the InvoiceID column --> UnPivot Other Columns
• Right click the Value column --> Split Column --> By Delimiter --> OK

Following the steps above gets us to this state:

To unwind this, we group it:

• Go to Transform --> Group By
• Group By InvoiceID, Attribute
• Aggregate a "Data" column using the All Rows operation

At this point, we need to number these rows, so I just busted out the pattern to do that from our Power Query Recipe cards (recipe 50.125).

• Go to Add Column --> Custom
• Column Name:  Custom
• Formula:  =Table.AddIndexColumn( [Data], "Row", 1, 1)
• Right click the Custom column --> Remove Other Columns
• Expand all fields from the Custom column
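In M, the group-and-number pattern above boils down to something like this (UnpivotedAndSplit is an assumed name for the state we reached before grouping):

```
let
    // Group on InvoiceID and Attribute, keeping each group's rows as a nested table
    Grouped = Table.Group(UnpivotedAndSplit, {"InvoiceID", "Attribute"},
        {{"Data", each _, type table}}),
    // Number the rows inside each nested table, starting at 1
    AddedCustom = Table.AddColumn(Grouped, "Custom",
        each Table.AddIndexColumn([Data], "Row", 1, 1)),
    RemovedOther = Table.SelectColumns(AddedCustom, {"Custom"}),
    // Expanding brings back one numbered row per original value
    Expanded = Table.ExpandTableColumn(RemovedOther, "Custom",
        {"InvoiceID", "Attribute", "Value", "Row"})
in
    Expanded
```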

Leaving us with this data:

The next step is to Pivot it:

• Select the Attribute column --> Transform --> Pivot Column
• Choose Value for the Values
• Expand the Advanced arrow
• Change the Aggregation to "Don't Aggregate"
• Click OK
• Select the "Row" column and Remove it.  (Yes, it was needed to unpivot this correctly, but now adds no value.)
• Set the data types
• Load it to the desired destination
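The pivot stage, again as a hedged M sketch (step names assumed, and the resulting column names come from the source data):

```
let
    // "Don't Aggregate" produces Table.Pivot with no aggregation function;
    // the Row column keeps each InvoiceID's values aligned across the three attributes
    Pivoted = Table.Pivot(Expanded, List.Distinct(Expanded[Attribute]),
        "Attribute", "Value"),
    RemovedRow = Table.RemoveColumns(Pivoted, {"Row"}),
    Typed = Table.TransformColumnTypes(RemovedRow,
        {{"ItemID", Int64.Type}, {"Quantity", Int64.Type}, {"Price", type number}})
in
    Typed
```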

At this point, the query (again) looks perfect:

Now, I must admit, this felt far more professional and left me feeling good about it.

# Which Solution to Power Query Challenge 6 is Better?

Naturally, solution 2 is better.  It takes fewer queries, and looks way cooler.  Right?  Not so fast...

The real question is in the performance.  And for this one I thought I'd test it.  But I needed more data.  I expanded the set to 11,000 rows and then used a tool we're working on to time the refreshes.  Privacy was left on, and all times shown are in seconds:

• Solution 1:  1.43, 1.48, 1.11, 1.27  Avg ~1.32 seconds
• Solution 2:  2.77, 2.65, 2.63, 2.68  Avg ~2.68 seconds

I'll be honest, this surprised me.  So I went back and added the GroupKind.Local parameter into the Grouped Rows step, like this (as that often speeds things up):

```
Table.Group(#"Changed Type1", {"InvoiceID", "Attribute"},
    {{"Data", each _, type table [InvoiceID=number, Attribute=text, Value=number]}},
    GroupKind.Local)
```

The revised timing for Solution 2 now gave me this:

• Solution 2A:  2.54, 2.49, 2.56, 2.61  Avg ~2.55 seconds

So while the local grouping did have a small impact, the message became pretty clear here.  Splitting this into smaller chunks was actually way more efficient than building a more elegant "all in one" solution!

My solution (including 5,000 rows of the data), can be found in the solution thread here.

# Power Query Challenge 6

Are you ready for Power Query Challenge 6?  In this challenge we'll look at splitting nested data sets into a nice table in First Normal Form.

# Let's take a look at Power Query Challenge 6:

## Where did the challenge come from?

The inspiration for this challenge came from last week's Self Service BI Boot Camp that we hosted in Vancouver, BC (you should have been there, it was awesome).  At the outset, we talked about how to identify if your data is in First Normal Form (the format that is ideal for a PivotTable), and I showed this data set:

Data table with multiple data points per cell

Notice how the invoice has multiple ItemID, multiple Quantity and multiple Price fields per cell?  That's not good at all.

## What we Really Need - Data in First Normal Form

As I explained in the Boot Camp, it is essential that the data source be in First Normal Form in order for a PivotTable to consume it.  What does that mean?  It means that it needs to look like this:

Data shown in First Normal Form (1NF)

Notice that in this case the data is atomic - it only has one data point per cell.  In addition, there is now one complete record per row.  The InvoiceID shows on every row now, and each data point has been correctly split up and applied to the appropriate record.

# So what's the thrust of the Power Query Challenge 6?

Well, as it turns out this is a bit tricky.  There are a few issues at play here:

• The data is separated by line feeds within the cells
• There are a different number of line feeds in each row
• At least the number of line feeds is consistent for each cell in the entire row though!

So the challenge is this: break the table apart so that the data is in First Normal Form like the second image.

You can download the data source file (and post your solutions) in our forum here.  I'll give you my version tomorrow.

# Power Query Intellisense and Highlighting in Excel

I'm super excited that Power Query Intellisense and Highlighting features are finally here in Excel!  It's been a long time coming, with these features debuting in Power BI Desktop months ago.

# Where do Intellisense and Highlighting Show up?

Let's look at all three places that Intellisense and Highlighting are now exposed in Excel's Power Query.

## Intellisense and Highlighting in the Formula Bar

The first thing you should notice about the image below is that Excel now adds colour highlighting to the text in the formula bar.  "Text" values are shown in red, keywords in blue, and data types in green:

In addition, you can see Intellisense showing in the picture above as well.  In this case the cursor is immediately after the opening parenthesis character.  Unlike in the past, we now get a nice syntax popup box that tells us what needs to come next in this function.

## Intellisense in Custom Columns

This is the place where I really wanted to see this happen.  Here's a look at Intellisense working live as I started to build a column using a Text function:

Pretty sweet, as this actually starts to display the options.

(Now, to be fair, in my build of Excel, this feature only seems to consistently work if I rename the column first, then tab into the formula builder.  If I just click straight into the formula, bypassing the tab order, it doesn't seem to kick in.)

## Syntax Highlighting in Custom Columns

How about syntax highlighting?  The formula here is only meant for demo purposes, but you get the idea of how it can look:

Blue keywords for if, then and else.  Hard-coded values (like 3) show up in green like data types, and text shows in red.

## Intellisense and Highlighting in the Advanced Editor

And finally, Intellisense and highlighting work in the Advanced Editor as well.  Here's a look at the syntax highlighting:

And here's what happens when we start manually adding a new step, which kicks in the Intellisense in the Advanced Editor window:

# How do you get these features?

First, you'll need to be on Office 365.  Sorry Excel 2016 and 2019, but my understanding is that these new features are only coming to subscription users.  (That's another reason that - in my opinion - you should never buy another 4 digit numbered release of Excel ever again.)

If you are on subscription, you get them with your regular updates.  This feature set hit the Insider build channel last month in version 1907.  It has also been deployed to those on the Monthly Targeted channel running version 1907, build 11901.20080 or newer.

If you're not on the Monthly Targeted channel, you'll need some patience, as it's coming.  Just keep checking those updates!

# Microsoft Business Applications Summit Recordings

Ken had a blast in June going down to Atlanta for the Microsoft Business Applications Summit (MBAS). There, he and co-author Miguel Escobar led an in-person workshop together for the first time. Ken also presented a breakout session on best practices for spreadsheet modeling, data shaping and data analysis. However, his real highlight was helping people with their Excel questions at the Ask the Experts booth.

Ken Puls hanging out at the Ask The Experts booth with his 'M is for Data Monkey' co-author Miguel Escobar and Excel legend Bill Jelen (aka Mr. Excel).

At MBAS, Microsoft also unveiled their new Power Query website. It's wonderful to finally have an official Microsoft site dedicated to this amazing tool. In addition, we're extremely proud that the Excelguru blog, along with the books and website created as part of our Power Query Training project, are listed in the Resources section!

We are thrilled to be included on the Resources page of the new Power Query official website!

# On-demand Session Recordings

If you weren't able to make it to MBAS, or didn't get to all the sessions you wanted to, Microsoft has tons of on-demand recordings available for FREE! This is an amazing resource to help you continue to learn about and explore the Microsoft ecosystem. You can check them all out at the Microsoft Business Applications Summit website. Microsoft broke the sessions down into 4 streams: Power BI, PowerApps, Microsoft Flow, and Microsoft Dynamics.

# Excel Sessions

Microsoft included Excel sessions in the Power BI stream under the "Other" category. Thus, they may be a bit tricky to locate. Luckily the Excel team gathered together a list of available recordings that we wanted to share. This includes Ken's session with David Monroy of Microsoft on data modeling and analysis best practices:

# Power Query Sessions

The Power Query sessions at Microsoft Business Applications Summit were also part of the Power BI stream. Hence, we've compiled a list of available recordings to make them easier to find:

Unfortunately, we do not have a recording of Ken and Miguel's workshop on Working with Data in the Power Platform and Excel.

# Microsoft Business Applications Summit 2020

Mark your calendar - next year's event will be held in Anaheim, CA on April 20 and 21. Additionally, you can sign up to receive updates for MBAS 2020 info on the event site. Perhaps we'll see you there!

# One Solution to Power Query Challenge 5

Hopefully you’ve had a chance to try and build a solution to Power Query Challenge 5.  In this thread, I’ll show you how I approached it.

# Challenge Goals

The challenge in this case was to allocate a fixed amount for a series of account IDs based on the data in this list:

Can you make a series from one to the other?

Creating this…

Here is the data we need to create

And as a reminder, the rules were:

• The first 7 digits won’t change for the series
• If there is no “To” value, then we only need the single value
• If the final character is text, it will never exceed Z. (i.e. we’ll never go from 10A to 11C)

And you can click here to get the original data file (as well as see solutions posted by myself and others).

I would also like to quickly restate that another of my main goals was to do as much using the UI as possible, and make certain that the students in my workshop could maintain this solution.  During the three-day workshop (similar to my upcoming Self Service BI Boot Camp), we did a full day of Power Query work, but mainly around reshaping data through the user interface.  The deepest we got into M code was creating a list using a structure like {1..10}.  So moving to crazy conditional logic and List.Generate solutions was out of the scope of where I wanted to take this for them. 😉

# One solution to Power Query Challenge 5

I broke my solution down into four distinct queries as follows:

Let’s look at each of those steps.

## The Base Query

The point of this query was basically to carve up the “From” and “To” portions into the components that I would need in order to get my solution.  With the preface remaining the same for every ID in the list, it was all about separating the last 3 digits to go from this table:

To this one:

The steps I used were as follows:

• Select the [From] column --> Add Column --> Extract --> First Characters --> 7
• Name the new column “Preface”
• Select the [From] column --> Add Column --> Extract --> Last Characters --> 3
• Name the new column “From3”
• Add a new conditional column called “To3” with the following formula:
• =if [To] = null then [From3] else Text.End([To],3)
• Remove the [From] and [To] columns
• Set the data types for each column (notice [From3] and [To3] are text, not numbers)
• Load the query as a Connection Only query
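As a sketch, the Base query's M boils down to this (the Source step stands in for the actual connection, and step names are assumptions):

```
let
    Source = RawData,  // assumed: the raw table with [From], [To] and the amount to allocate
    AddedPreface = Table.AddColumn(Source, "Preface",
        each Text.Start([From], 7), type text),
    AddedFrom3 = Table.AddColumn(AddedPreface, "From3",
        each Text.End([From], 3), type text),
    // When there is no "To" value, the series is just the single "From" ID
    AddedTo3 = Table.AddColumn(AddedFrom3, "To3",
        each if [To] = null then [From3] else Text.End([To], 3), type text),
    RemovedCols = Table.RemoveColumns(AddedTo3, {"From", "To"})
in
    RemovedCols
```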

## The Numeric Query

Next up, I needed to generate the series for the numeric series.  This was easy, as it works similar to the Power Query Calendar table recipe.  (Recipe card 60.105.x which you can find in our Power Query Recipe Cards)

The steps I used do this were:

• Reference the Base query
• Change the type of the [From3] and [To3] columns to Whole Number
• Filter the [From3] column to remove errors
• Added a new Custom Column with the following formula:
• ={ [From3]..[To3] }
• Removed the [From3] and [To3] columns
• Expanded the new [Custom] column
• Merged the [Preface] and [Custom] columns into a new “ID” column
• Loaded this query as Connection only
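The Numeric query, sketched in M (step names assumed):

```
let
    Source = Base,
    // Converting to whole numbers turns the alphanumeric rows into errors...
    Typed = Table.TransformColumnTypes(Source,
        {{"From3", Int64.Type}, {"To3", Int64.Type}}),
    // ...which we then remove, leaving only the purely numeric series
    NoErrors = Table.RemoveRowsWithErrors(Typed, {"From3", "To3"}),
    // {from..to} builds the full run of IDs; a single-value series yields a one-item list
    AddedList = Table.AddColumn(NoErrors, "Custom", each {[From3]..[To3]}),
    Removed = Table.RemoveColumns(AddedList, {"From3", "To3"}),
    Expanded = Table.ExpandListColumn(Removed, "Custom"),
    MergedID = Table.CombineColumns(
        Table.TransformColumnTypes(Expanded, {{"Custom", type text}}),
        {"Preface", "Custom"}, Combiner.CombineTextByDelimiter(""), "ID")
in
    MergedID
```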

This left me with the query output shown here:

And the cool thing is that, using this technique, the final series (which only had a “From” ID and no “To” ID) just creates the single item.

## The Alpha Query

Next up was the alpha query.  Once again, I referenced the Base query, meaning that I started from here:

The steps I followed here were:

• Duplicate the [From3] column
• Change the data type on the [From3 – Copy] column to Whole Number
• Filtered the [From3 – Copy] column to KEEP errors
• Removed the [From3 – Copy] column
• Split the [From3] column by Number of Characters, splitting at the left-most 2 characters
• Split the [To3] column by Number of Characters, splitting at the left-most 2 characters
• Added a new Custom Column with the following formula
• = { [From3.2]..[To3.2] }

So at this point, I’m looking at this:

Continuing on, I…

• Removed columns [From3.2], [To3.1] and [To3.2]
• Expanded the list from the [Custom] column
• Merged the [Preface] , [From3.1] and [Custom] columns into a new “ID” column
• Loaded this query as Connection only

And these steps left me with a nice list of the alphanumeric series as follows:
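Sketched in M, the Alpha query comes out something like this (again, step names are assumptions):

```
let
    Source = Base,
    Duplicated = Table.DuplicateColumn(Source, "From3", "From3 - Copy"),
    Typed = Table.TransformColumnTypes(Duplicated, {{"From3 - Copy", Int64.Type}}),
    // This time we KEEP the conversion errors: those are the alphanumeric rows
    KeptErrors = Table.SelectRowsWithErrors(Typed, {"From3 - Copy"}),
    RemovedCopy = Table.RemoveColumns(KeptErrors, {"From3 - Copy"}),
    SplitFrom = Table.SplitColumn(RemovedCopy, "From3",
        Splitter.SplitTextByPositions({0, 2}), {"From3.1", "From3.2"}),
    SplitTo = Table.SplitColumn(SplitFrom, "To3",
        Splitter.SplitTextByPositions({0, 2}), {"To3.1", "To3.2"}),
    // A character range like {"A".."E"} generates the run of trailing letters
    AddedList = Table.AddColumn(SplitTo, "Custom", each {[From3.2]..[To3.2]}),
    Removed = Table.RemoveColumns(AddedList, {"From3.2", "To3.1", "To3.2"}),
    Expanded = Table.ExpandListColumn(Removed, "Custom"),
    MergedID = Table.CombineColumns(Expanded, {"Preface", "From3.1", "Custom"},
        Combiner.CombineTextByDelimiter(""), "ID")
in
    MergedID
```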

## The IDs Query

At this point, things became super simple.  There’s no real magic to this query, as the data has already been prepared and normalized in the Numeric and Alpha queries.  So all that needs to be done here is:

• Reference the Numeric query
• Append the Alpha query
• Set the data types (just to be sure)
• Load the query to the intended destination
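The IDs query itself is then just an append of the two staging queries, roughly:

```
let
    Source = Numeric,
    Appended = Table.Combine({Source, Alpha}),
    Typed = Table.TransformColumnTypes(Appended, {{"ID", type text}})
in
    Typed
```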

And that’s it.

# That’s One Solution to Power Query Challenge 5

Of course, there are countless other ways to solve this.  My goal in this case was specifically to avoid a custom function, as I didn’t want to obfuscate the code for a new Power Query user.  Could I have done this all in one query with use of more complicated Custom Columns?  Sure.  But again, would it be maintainable by people newly introduced to Power Query?  Hopefully intellisense in the Custom Column Dialog will change my answer, but right now I’d say that’s unlikely.

One of the things I love about Power Query is the ability to break this down into separate queries, then put them back together.  The reality is that this series of data was mixed, needing slightly different treatment in each case.  This setup allowed me to clearly focus on the goal, getting it done without noise from the other type of data in the set, then just append them afterwards.  I call that a win as it’s easily auditable by someone closer to the beginner side of the Power Query learning curve.

# Power Query Challenge 5

It’s been a while since our last challenge, so hopefully you’re ready for Power Query Challenge 5!  This is one that came up as I was teaching one of our 3 day Excel BI Bootcamp courses for a client, (similar to this public course!) which actually made it doubly tricky.  Why?  Because I needed the person to be able to both understand how I approached the task, as well as be able to maintain it after I left.  I didn’t want to just kick up some magic voodoo and walk away, as that isn’t really fair.

# The issue for Power Query Challenge 5

The challenge in this case was to allocate a fixed amount for a series of account IDs.  They provided a starting ID, an ending ID, and a value.  Sounds easy so far, right?  Here’s the problem… the key part of some of those IDs was purely numeric, but the key part of others was alphanumeric!  An example of the source data:

Can you make a series from one to the other?

And the required output:

Here is the data we need to create

The rules – as they were explained to me – were as follows:

• The first 7 digits won’t change for the series
• If there is no “To” value, then we only need the single value
• If the final character is text, it will never exceed Z. (i.e. we’ll never go from 10A to 11C)

# How would you solve Power Query Challenge 5?

My first thought was that I’d have to reach to a custom function to do this, but again, I didn’t feel that was fair to my audience with only a day of Power Query training under their belt.  (With Dimensional Modeling and DAX to cover in our workshop, we don’t have time to take people deep into custom functions.)  After playing with it for a bit though, I was able to work past that problem and come up with a UI driven solution which works.  (Having said that, how you decide to solve this is really up to you!)

Now last time, I posted my solution the same day, along with my steps.  The result was that we had less engagement, possibly because that gave too much away.  So this time, I’ve created the thread for you to post your solutions in the forum, but I’m going to hold off on posting my approach until tomorrow.

Good luck!

# Self Service BI with Excel and Power BI

This week Ken is at the Microsoft Business Applications Summit in Atlanta. However, we're super excited that he'll be back on home turf in July for our next public training session. The 3-day Self Service BI Boot Camp in Vancouver will change your company's reporting game forever!

## Is Self Service BI a Double-Edged Sword?

Matthew Roche, a member of Microsoft's Power BI CAT Team and noted sword enthusiast, recently posted about a trend he's seeing. His global customers often describe self service BI as a "double-edged sword." After thinking about this comparison, he felt the simile held up. But perhaps not in the way you think!

In the post on his blog, Matthew summed it up nicely. Having two sharp edges is only dangerous to the wielder if they don't know how to effectively use each of the distinct and complementary edges together.

## Approaching Self Service BI for the First Time

Perhaps you've been thinking about implementing some self service BI solutions within your organization. And you've been hearing a lot of buzz about Power BI. But you're unsure where to start or how Power BI fits in with your existing data platforms and reporting tools.

Our Self Service BI Boot Camp, with 3 full days of immersive hands-on learning, might be just the place to begin. Guided by the Excelguru, Ken Puls, you'll work with both Excel and Power BI to build some really cool self service BI solutions! We'll not only show you which tool is right for which kind of job, but how to use them together in one solution.

## What will the Boot Camp Cover?

We know that data is not usually stored in nicely curated databases and often - even when it is - the data analyst doesn’t have access to it. Instead, we must piece together data provided in text files, Excel files, web pages, and even the body of emails to build the business intelligence solutions we need. Until now this has been a painful process with a great deal of cut and paste effort, and updates have been tedious and error prone. That stops with this workshop!

In the Self-Service BI Boot Camp we'll dive into:

• a deep exploration of Power Query for gathering, cleaning, and preparing your ugly data
• the benefits, concepts, and key terminology of Dimensional Modeling
• an introduction to DAX, including:
• how to create a variety of DAX measures
• how to control their Filter Context
• understanding how DAX measures are calculated

• calendar intelligence, such as:
• building calendar tables
• using the “Golden Date” pattern
• extending our model to report based on our own year-end
• how to leverage the strengths of Power BI for sharing and reporting
• specific features of Excel and Power BI that every analyst should know, plus:
• which tool to use for which job
• how they can be used together
• recommended best practices for boosting the performance of your self service BI solution

You can read more about the class on the Excelguru website. Likewise, check out the course outline to see what will be covered in each of the 3 days.

## Top 5 Reasons to Attend

1. Our Self Service BI Boot Camp is loaded with hands-on, practical experience. As Ken's friend, Thomas LaRock, recently commented on Twitter, "Telling isn't teaching." Our philosophy is that you need to actually work with a tool in order to really learn it.
2. The small class size provides a more intimate learning environment. Working with a small group allows Ken to interact directly with attendees, answering your questions and fielding discussions as they come up. You get to pose questions and discuss scenarios with one of the world's leading self service BI experts. However, this means that spots are limited, so don't  miss out on your chance to sign up.
3. There's no better time to come to Vancouver than in July - it's a beautiful city all year round, but is in its prime this time of year. That being said, while we can't 100% guarantee great weather, we CAN guarantee you'll get some GREAT training!
4. Early registration pricing is now in effect. You can save \$200 if you sign up before June 24, 2019. Simply go to our website and enter the following coupon code at checkout: SSBIJULY2019.
5. Registration includes some valuable resources to take with you, so you can refer to them later. You'll get a copy of Ken's slides, many of which contain handy recipes to walk you step-by-step through the techniques. Additionally, you receive copies of all the class example files, including completed versions for later review. These examples are based on real-world scenarios and provide you with techniques that you can begin applying to your own data right away.

So what are you waiting for? Come and join us to revolutionize your reporting process!

# Creating Two-Tiered, Multi-Frequency Period Cycles

Sometimes, what should be easy code to write has unexpected pitfalls, resulting in opportunities for new learning. I needed to generate a one- or two-tiered stream of date values with multi-frequency period cycles: monthly vs annual.

For example, after installing a piece of equipment I wanted to schedule 12 monthly calibrations, beginning with the month after installation, followed by 3 annual calibrations, beginning 6 months after the last monthly calibration.

This is a table of schedule parameters that defines a few sample scenarios:

## Create a Simple List using List.Generate()

My plan was to create generic schedule scenarios, based on specific types of equipment and use period parameters relative to the installation month.

My first thought was to use Power Query’s List.Dates() function to create the sequence of dates, but List.Dates() only increments by combinations of days/hours/minutes/seconds. Not helpful when you need to increment by months. Consequently, I turned to the List.Generate() function to generate the list of periods.

First I wrote some M-Code to test the concept using hard-coded parameters:

Query: JustCreateList
M Code:

And the results were this:

Great! That was easy. If I can make a list and add it to each row of a table, then I can expand each list and build cycle dates.
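A minimal sketch of a hard-coded test along those lines (the dates and counts here are assumptions, not the original listing):

```
let
    InstallDate = #date(2019, 1, 15),
    // Generate 12 monthly period dates, starting one month after installation
    MonthlyDates = List.Generate(
        () => 1,                      // seed: the period counter starts at 1
        each _ <= 12,                 // condition: stop after 12 periods
        each _ + 1,                   // next: increment the counter
        each Date.AddMonths(Date.StartOfMonth(InstallDate), _)  // selector: project a date
    )
in
    MonthlyDates
```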

## Use List Values to Create Dates

Query: BuildDates_HardcodedParams
M-Code:

And the results are:

So far, so good. (I may leave early today.)

## Use Column Values as List.Generate() Parameters

Next, let’s use column values to drive the Lists. What could be easier, right? Here’s one of my many failed attempts:

Query: BuildDates_ColValParams_V1
M-Code:

And the results are:

Wait! What??? Where are my lists? And, what “field access”?

## I Discover “Internal Context” and “External Context”

I won’t go into all of the iterations I tried to get that darn code to work. Suffice it to say that I tried every conceivable variation of “this”, “each”, “_” and anything else I could think of. All to no avail… Same error message.

Now, I could have built a separate function (which I did, and it worked):

Query: fnCreateList
M-Code:

I also could have embedded a function within the M-Code (which I did, and it worked):

Query: SingleTier_EmbeddedFunction
M-Code:

BUT, I wanted a concise, elegant solution. I wanted to just put the List.Generate() function in my code and reference the table columns, just like I would with an Excel function (which I did and…well…you saw the results). I needed help.

So, I posted on Ken’s forum. And who do you think volunteered to assist me? Miguel! (Thank you, Miguel.) Once he established that my interest in referencing column values in List.Generate() had gone way past idle curiosity and quest…all the way to “mission from god”, he gave me a working example and a concise explanation.

## Handling Internal and External Context

Because the List.Generate() function is essentially recursive, it has to look at its own sequential results and test them against its internal limits. It needed explicit code to indicate when it should use external context (table references) and when it should use internal context (intermediate function results). I won’t pretend that I understand the “why” of the required syntax, but I quickly picked up on the “how”.

Here’s the working M-Code that uses column values in the List.Generate() function to create the first date sequence:

Query: SingleTier_EmbeddedInnerContext
M-Code:

By simply creating a dummy parameter (listval), the List.Generate() function automatically populated it with its own sequential results, and it understood that the column references pointed to the Table! I could have named that parameter “moonchild” or “Sue” or anything else. All that mattered was that it had something to populate.
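Here's a minimal sketch of that pattern (the column names, like [NumPeriods], are assumptions). Because the inner lambdas name their parameter explicitly, the shorthand field accesses still resolve against the current table row from the outer each, while listval carries List.Generate()'s own internal state:

```
AddedPeriods = Table.AddColumn(Source, "Periods", each
    List.Generate(
        () => 1,
        (listval) => listval <= [NumPeriods],   // [NumPeriods] reads the current row (external context)
        (listval) => listval + 1,               // listval is the generator's state (internal context)
        (listval) => Date.AddMonths([InstallDate], listval)
    ))
```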

Now, my results were valid:

Over the first major hurdle and it was a BIG one!

## Combining Lists to Generate Two-Tiered Multi-Frequency Period Cycles

The rest of the solution was relatively easy after that. I needed to allow for a second tier of cycle dates. The basic steps were:

1. If Tier1 begins after the Install Date, make a placeholder for the Install Date… a period zero.
2. Create the Tier1 date sequence.
3. If there’s a Tier2 sequence, create it.
4. Sequentially append the constructed cycle date sequences.

Since that involves M-Code not covered in this blog post, I’ll just post the final solution:

Query: TwoTier_EmbeddedInnerContext
M-Code:

And here are some of the results of my table containing the desired two-tiered, multi-frequency period cycles:

Power Query/Get and Transform has steered my career into new, creative, challenging, and rewarding directions. I’d like to thank Ken for giving me the opportunity to share what I learned about internal/external context and my learning process. I welcome any and all questions and comments.

-Ron Coderre

# Creating a Fiscal Saturday Calendar

A recent question from one of our Power Query Academy registrants was about creating a fiscal Saturday calendar - or a calendar where periods end on the last Saturday of each month.  I cooked up a sample of one way to approach this task, but I'm curious if anyone out there has something better.

# Basic Goal of the Fiscal Saturday Calendar

The basic goal here is to create a full calendar table with three columns:

1. A primary "Date" column which holds a single value for every day of the period
2. A "Fiscal ME" column that holds the fiscal month end based on the final Saturday of the month
3. A "Fiscal YE" column that holds the fiscal year end based on the last Saturday of the year

The trick here is that, unlike a 4-4-5 calendar which has consistent repeating pattern, the number of weeks per month end could shift unpredictably - especially when you take into account leap years.

# Starting with the basic Calendar framework

I started as I usually do with a Calendar table, by following my standard calendar pattern from our Power Query Recipe Card set.  I busted out pattern 60.100 to build both the calendar StartDate and EndDate, leaving me two queries with the appropriate values:

And from there, using recipe card 60.105, I expanded this into a full blown calendar with every day from StartDate to EndDate:

I named this table Calendar-Base and loaded it as a Staging (or Connection only) Query (covered in recipe card 0.110).

The only real thing to note here is that my StartDate and EndDate are the first day of the first fiscal year (a Sunday), and the EndDate is the end of the next year (a Saturday that is 768 days later.)
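For reference, the kind of base calendar that pattern produces can be sketched in M as follows (assuming StartDate and EndDate are the two staging queries):

```
let
    // One row per day, from StartDate through EndDate inclusive
    DayCount = Duration.Days(EndDate - StartDate) + 1,
    DateList = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
    AsTable = Table.FromList(DateList, Splitter.SplitByNothing(), {"Date"}),
    Typed = Table.TransformColumnTypes(AsTable, {{"Date", type date}})
in
    Typed
```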

# Creating Fiscal Month and Year ends for the Fiscal Saturday Calendar

The next step was to create a table that generated the fiscal month ends and fiscal year ends for the calendar.  So I created a new query that referenced the Calendar-Base query.

To ensure it was a fiscal Saturday calendar, the dates needed to be based on the last Saturday of the month and the last Saturday of the year.  And in order to work these out, I needed two columns: DayID and Day of Week.  To create these:

• DayID:  I added an Index Column starting from 1 (renamed to DayID). This generates a unique DayID for every row of the table
• Day of Week:  I selected the Date column --> Add Column --> Date --> Day of Week
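Behind the scenes, those two UI steps generate M along these lines (a sketch - your step names will vary, and the UI names its new column "Day of Week"):

```
// Add a 1-based index as the unique DayID
AddDayID = Table.AddIndexColumn(#"Calendar-Base", "DayID", 1, 1),
// Date.DayOfWeek with a Day.Sunday start returns 0 (Sunday) through 6 (Saturday),
// which is why the Fiscal ME formula below tests for a value of 6
AddDayOfWeek = Table.AddColumn(AddDayID, "Day of Week",
    each Date.DayOfWeek([Date], Day.Sunday), Int64.Type)
```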

With these in place, I was now set to create the Month End and Year End columns as follows:

• Add Column --> Custom Column
• Name:  Fiscal ME
• Formula:

= if Duration.Days(Date.EndOfMonth([Date]) - [Date]) < 7 and [Day of Week] = 6 then [Date] else null

• Add Column --> Custom Column
• Name:  Fiscal YE
• Formula:

=if Number.Mod([DayID],364)=0 then [Date] else null

These formulas flagged the fiscal Saturday calendar periods as shown here:

The final steps to this stage were then to:

• Filter all the nulls out of the Fiscal ME column
• Remove all but the Fiscal ME and Fiscal YE columns
• Fill the Fiscal YE column up
• Set the data types on both columns to Date
• Name the table "Calendar-FiscalPeriods"
• Load as a Staging query (recipe card 0.110 again)
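In M terms, those cleanup steps amount to something like this (a sketch, assuming the two custom columns above were named Fiscal ME and Fiscal YE):

```
// Keep only the rows that were flagged as fiscal month ends
FilterNulls = Table.SelectRows(AddFiscalYE, each [Fiscal ME] <> null),
KeepPeriods = Table.SelectColumns(FilterNulls, {"Fiscal ME", "Fiscal YE"}),
// Fill the year end up, so every month end knows which fiscal year it belongs to
FillYE      = Table.FillUp(KeepPeriods, {"Fiscal YE"}),
SetTypes    = Table.TransformColumnTypes(FillYE,
    {{"Fiscal ME", type date}, {"Fiscal YE", type date}})
```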

At the end of the process, the calendar clearly shows our fiscal Saturday calendar period ends:

# Finishing the Fiscal Saturday Calendar

The final step is now to put these together. The way I approached this was:

• Create a new query that references the Calendar-Base table
• Merge the Calendar-FiscalPeriods query in, matching the Date column to the Fiscal ME column (recipe card 30.105)
• Expand the Fiscal ME and Fiscal YE columns
• Fill the Fiscal ME and Fiscal YE columns up
• Name the query Calendar
• Load it to the desired destination

Now, to be fair, the calendar only looks like this at this point:

I could certainly add other components.  For a Fiscal Year column, I'd select the Fiscal YE column and add a Year column based on it.  For months, I'd add a month based on the Fiscal ME column.  And many other patterns can be applied based on the standard date transforms.
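Those follow-on columns are just more Add Column steps; for example (a sketch, assuming the column names above):

```
// Fiscal Year: the calendar year in which the fiscal year ends
AddFiscalYear  = Table.AddColumn(Filled, "Fiscal Year",
    each Date.Year([Fiscal YE]), Int64.Type),
// Fiscal Month: the calendar month of the fiscal month end
AddFiscalMonth = Table.AddColumn(AddFiscalYear, "Fiscal Month",
    each Date.Month([Fiscal ME]), Int64.Type)
```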

And one caveat... the dates fed in must start on the first day of a fiscal year and end on the last day of a fiscal year to ensure it works correctly.

# Do you have an easier way?

So here comes the thrust of this... I have easy patterns for standard 12 month calendars, 4-4-5 and their variants, and even 13 weeks per year.  But this one, with its shifting weeks per month, threw me off a bit.  I'm curious if anyone has an easier way to generate this which wouldn't rely on splitting this out into separate tables.

# Using Rich Data Types in Power Query

If you’re on Office 365 and don’t have Excel’s new Rich Data Types, you should know that they’ll be coming to you soon.  Giving us the ability to create both Stocks and Geographies, these are going to add some exciting new capabilities to Excel, particularly if we want to enrich our data.  In this post, we'll quickly explore what Rich Data Types are, what they add, and how they are treated by Power Query.

# What is a Rich Data Type?

Have a look at the following data:

The challenge with this data is that it is completely text based.  What if we wanted to enrich this with more information like population, latitude or longitude?  The answer is to convert it to Excel’s new Rich Data Type.  To do this:

• Select the data
• Go to the Data tab -> Data Type -> Geography

This will then convert the text into “Entities” with a little map icon beside them.  And clicking on that little map icon shows some pretty cool new things:

This is the new geography data type. Unlike the original text entry, this object contains all of the properties you see on the card, adding a whole bunch of power to our original data.

NOTE:  The data on this card comes from a variety of sources such as Wikipedia and WeatherTrends360.  Full attribution can be found at the bottom of the card.

# Working with a Rich Data Type

One of the very cool things about this new data type is the ability to expand the enriched data from the object.  To do this:

• Mouse over the top right of the table
• Click the Add Column button
• Check the box(es) next to the columns you want to add

Shown below, we’ve extracted Latitude, Longitude, Population and Name.

Note:  This button just writes the formulas needed to extract the data from the Rich Data Type. We could have easily written formulas to do this ourselves, such as =[@Location].Latitude (inside a table) or =A4.Latitude.

The impacts of this should be pretty clear… even though we started with text, we now have the ability to convert it into a real place and pull further data back from that area!

# Rich Data Types and Power Query

The ability to enrich a plain text data source is huge.  One simple example of their impact is that we could add the Lat/Long coordinates to allow proper mapping in Power BI. But how will Power Query read these new Rich Data Types?  Not well as it turns out…

Ideally, Power Query would pull in this data and recognize it as a proper record, which would allow you to extract the elements.  And while I’m sure that will happen one day, it won’t be possible when Rich Data Types hit your build of Excel.

The trick to getting at this data today is actually already evident in the image above: create new columns in the original table.  Even though Power Query (in Excel or Power BI) can’t read the Rich Data Type itself, it CAN read the columns you extract via formulas.  It’s a workaround, and one we’d prefer not to have to do, but at least we can get to the enriched data that these new data types give us.