Creating a Fiscal Saturday Calendar

A recent question from one of our Power Query Academy registrants was about creating a fiscal Saturday calendar - or a calendar where periods end on the last Saturday of each month.  I cooked up a sample of one way to approach this task, but I'm curious if anyone out there has something better.

Basic Goal of the Fiscal Saturday Calendar

The basic goal here is to create a full calendar table with three columns:

  1. A primary "Date" column which holds a single value for every day of the period
  2. A "Fiscal ME" column that holds the fiscal month end based on the final Saturday of the month
  3. A "Fiscal YE" column that holds the fiscal year end based on the last Saturday of the year

The trick here is that, unlike a 4-4-5 calendar, which has a consistent repeating pattern, the number of weeks per fiscal month can shift unpredictably - especially when you take leap years into account.

Starting with the basic Calendar framework

I started as I usually do with a Calendar table, by following my standard calendar pattern from our Power Query Recipe Card set.  I busted out pattern 60.100 to build both the calendar StartDate and EndDate, leaving me two queries with the appropriate values:

And from there, using recipe card 60.105, I expanded this into a full blown calendar with every day from StartDate to EndDate:

I named this table Calendar-Base and loaded it as a Staging (or Connection only) Query (covered in recipe card 0.110).

The only real thing to note here is that my StartDate is the first day of the first fiscal year (a Sunday), and my EndDate is the last day of the following fiscal year (a Saturday that is 727 days later, giving two full 364-day fiscal years).

Creating Fiscal Month and Year ends for the Fiscal Saturday Calendar

The next step was to create a table that generated the fiscal month ends and fiscal year ends for the calendar.  So I created a new query that referenced the Calendar-Base query.

To ensure it was a fiscal Saturday calendar, the dates needed to be based on the last Saturday of the month and last Saturday of the year.  And in order to work these out, I needed two columns:  DayID and DayOfWeek.  To create these:

  • DayID:  I added an Index Column starting from 1. This generates a unique DayID for every row of the table
  • DayOfWeek:  I selected the Date column --> Add Column --> Date --> Day of Week

With these in place, I was now set to create the Month End and Year End columns as follows:

  • Add Column --> Custom Column
    • Name:  Fiscal ME
    • Formula:

=if Duration.Days(Duration.From(Date.EndOfMonth([Date]) - [Date])) < 7 and [Day of Week] = 6 then [Date] else null

  • Add Column --> Custom Column
    • Name:  Fiscal YE
    • Formula:

=if Number.Mod([DayID],364)=0 then [Date] else null

These formulas flagged the fiscal Saturday calendar periods as shown here:
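If it helps to sanity check the logic outside of Power Query, here is a rough Python sketch of what the two formulas do (the Sunday start date and the 728-day span are my own illustrative assumptions, not taken from the sample file):

```python
from datetime import date, timedelta

def fiscal_flags(start, days=728):
    """For each DayID, flag the fiscal month end (the last Saturday of
    the calendar month) and the fiscal year end (every 364th day)."""
    rows = []
    for day_id in range(1, days + 1):
        d = start + timedelta(days=day_id - 1)
        # Last day of d's month: jump safely into next month, then back one day
        month_end = (d.replace(day=28) + timedelta(days=4)).replace(day=1) - timedelta(days=1)
        # A Saturday within 7 days of month end is the last Saturday of the month
        fiscal_me = d if d.weekday() == 5 and (month_end - d).days < 7 else None
        fiscal_ye = d if day_id % 364 == 0 else None
        rows.append((d, fiscal_me, fiscal_ye))
    return rows

# Starting on a Sunday, e.g. 2017-01-01:
rows = fiscal_flags(date(2017, 1, 1))
```

Just like the M version, most rows carry null (None) in both flag columns, with dates appearing only on the period-end rows.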

The final steps to this stage were then to:

  • Filter all the nulls out of the Fiscal ME column
  • Remove all but the Fiscal ME and Fiscal YE columns
  • Fill the Fiscal YE column up
  • Set the data types on both columns to Date
  • Name the table "Calendar-FiscalPeriods"
  • Load as a Staging query (recipe card 0.110 again)

At the end of the process, the calendar clearly shows our fiscal Saturday calendar period ends:

Finishing the Fiscal Saturday Calendar

The final step is now to put these together. The way I approached this was:

  • Create a new query that references the Calendar-Base table
  • Merge in the Calendar-FiscalPeriods query to get an Exact Match between the Date and Fiscal ME columns (recipe card 30.105)
  • Expand the Fiscal ME and Fiscal YE columns
  • Fill the Fiscal ME and Fiscal YE columns up
  • Name the query Calendar
  • Load it to the desired destination

Now, to be fair, the calendar only looks like this at this point:

I could certainly add other components.  For a Fiscal Year column, I just need to select Fiscal YE and add a Year column.  For months, I'd add a Month column based on the Fiscal ME column.  And many other patterns can be applied based on the standard date transforms.

And one caveat... the dates fed in must start on the first day of a fiscal year, and end on the last day of a fiscal year, to ensure it works correctly.

My sample file can be found here.

Do you have an easier way?

So here comes the thrust of this... I have easy patterns for standard 12-month calendars, 4-4-5 and its variants, and even 13 weeks per year.  But this one, with its shifting weeks per month, threw me off a bit.  I'm curious if anyone has an easier way to generate this that wouldn't rely on splitting things out into separate tables.

Using Rich Data Types in Power Query

If you’re on Office 365 and don’t have Excel’s new Rich Data Types, you should know that they’ll be coming to you soon.  Giving us the ability to create both Stocks and Geographies, these are going to add some exciting new capabilities to Excel, particularly if we want to enrich our data.  In this post, we'll quickly explore what Rich Data Types are, what they add, and how they are treated by Power Query.

What is a Rich Data Type?

Have a look at the following data:

Table of locations for experimenting with Rich Data Types

The challenge with this data is that it is completely text based.  What if we wanted to enrich this with more information like population, latitude or longitude?  The answer is to convert it to Excel’s new Rich Data Type.  To do this:

  • Select the data
  • Go to the Data tab -> Data Type -> Geography

This will then convert the text into “Entities” with a little map icon beside them.  And clicking on that little map icon shows some pretty cool new things:

Example of the Geography Data Type

This is the new geography data type. Unlike the original text entry, this object contains all of the properties you see on the card, adding a whole bunch of power to our original data.

NOTE:  The data on this card comes from a variety of sources such as Wikipedia and WeatherTrends360.  Full attribution can be found at the bottom of the card.

Working with a Rich Data Type

One of the very cool things about this new data type is the ability to expand the enriched data from the object.  To do this:

  • Mouse over the top right of the table
  • Click the Add Column dialog
  • Check the box(es) next to the columns you want to add

Adding enriched data to the Location table

Shown below, we’ve extracted Latitude, Longitude, Population and Name.

The Location table with enriched data added

Note:  This button just writes the formulas needed to extract the data from the Rich Data Type. We could easily have written formulas to do this ourselves, such as =[@Location].Latitude or =A4.Latitude.

The impacts of this should be pretty clear… even though we started with text, we now have the ability to convert it into a real place and pull further data back from that area!

Rich Data Types and Power Query

The ability to enrich a plain text data source is huge.  One simple example of their impact is that we could add the Lat/Long coordinates to allow proper mapping in Power BI. But how will Power Query read these new Rich Data Types?  Not well as it turns out…

The enriched Locations table has been brought into Power Query but creates an error

Ideally, Power Query would pull in this data and recognize it as a proper record, which would allow you to extract the elements.  And while I’m sure that will happen one day, it won’t be possible when Rich Data Types hit your build of Excel.

The trick to getting at this data today is actually already evident in the image above: create new columns in the original table.  Even though Power Query (in Excel or Power BI) can’t read the Rich Data Type itself, it CAN read the columns you extract via formulas.  It’s a workaround, and one we’d prefer not to have to do, but at least we can get to the enriched data that these new data types give us.

Task Tracking with Power Query

Did you know Power Query can be used as a task tracking tool? This might sound quite unusual but the method described here has been used for solving a real business case. The example I will use is rather simplified but still close to reality, and will demonstrate how to build task tracking with Power Query.

Laying out the Scenario

Vicky is the manager of a small team that deals with customer questions on various topics. One of her duties is to distribute those questions among her subordinates. After that, each of them should take some action and report the status of each task.

The problem is – how can each employee see what tasks are assigned to him/her and fill in the respective information for each task? At the same time, Vicky should, at any moment, be able to assign a new task and review the statuses of old ones. This is the table Vicky needs:
Task Tracking with Power Query

Unfortunately, she has no other tool at hand except Excel. Luckily, she can set up task tracking with Power Query right in Excel, which could work perfectly in this case.

Setting up Task Tracking with Power Query

So let's start building the solution.

1. Load the left table (in this example, called Filled by Manager) into Power Query.
Manager's table to assign tasks

2. Next, create one query for each employee by filtering the Employee column.
Create individual employee queries

3. Load each Employee table into a separate Excel sheet. (Of course, they can be in different files linked to the source table.)
Sample employee table showing tasks assigned

4. Then, create a table for each employee to fill in the actions and statuses.
Employee's task tracking worksheet

You can see in the above picture what each employee will have in his/her worksheet - a green table on the left with the tasks assigned to them, and a yellow table on the right where he/she has to fill in the respective information.

Creating the Filled By Employees Table

5. Load all the Employee tables into Power Query.
Load all the employee tables into Power Query

6. Append them in a new query (in this example, called Statuses).
Append all the employee tables into new Statuses query

You are probably guessing what the next step is – load the Statuses query into Excel right next to the Filled by Manager table.

However, the result is not what we would expect.
The Filled by Manager table is not matching the newly loaded Fill by Employees table

Note that the first row of the Manager’s table holds a task assigned to Ivan on 27.01.2019, but row 1 of the Employee’s table shows a task assigned to Maria on 09.02.2019.

In order to fix this mess, we need one additional query.

Building the Task Code Query

7. Once again, load the Manager’s table into Power Query and remove all columns except for Task Code.
Task Code Column

Task Code is a unique identifier of each task. In this example, it is simply composed of the employee's name and the number of occurrences of this name in the register up to the respective row. In Excel language, the formula is:
Use COUNTIF to create unique task identifiers

The trick is that we fix the first row of the range in column F (containing the employees' names), but leave the end row relative.
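For anyone who prefers to see the running-count logic outside of Excel, here is a small Python sketch of that COUNTIF pattern (the function name is mine):

```python
def task_codes(names):
    """Build a unique task code per row: the employee's name plus the
    number of times that name has appeared so far - the same idea as
    =F2&COUNTIF($F$2:F2,F2) with a fixed start row and a relative end row."""
    counts = {}
    codes = []
    for name in names:
        counts[name] = counts.get(name, 0) + 1
        codes.append(f"{name}{counts[name]}")
    return codes
```

For example, task_codes(["Ivan", "Maria", "Ivan"]) yields ["Ivan1", "Maria1", "Ivan2"] - each repeat of a name gets the next number.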

8. Merge the Register Employees and Statuses queries together.
Merge the Register Employees and Statuses tables

9. Finally, expand the table and voila - it is in the required order. The only thing left is to load it back into the Manager’s table.
Final table for task tracking with Power Query
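In effect, the merge is a left join of the manager's Task Codes against the appended statuses, which restores the manager's original row order. A Python sketch of the idea (function and variable names are mine):

```python
def reorder_statuses(task_codes, statuses):
    """Left-join the appended employee statuses (code, status pairs)
    back onto the manager's Task Code order, so each status lands
    beside its original task; unmatched codes get None."""
    by_code = {code: status for code, status in statuses}
    return [(code, by_code.get(code)) for code in task_codes]
```

So even though the appended Statuses query arrives in a scrambled order, joining on the unique Task Code puts every row back where the manager expects it.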

Now, any time she needs to, Vicky can refresh the Filled by Employees table and see the updated statuses of each task.

Likewise, each one of her subordinates can simply refresh the Manager’s table (the green one that is on left of his/her tab) to see any new tasks that have been assigned.

You could also automate the refresh operation with VBA. For more details, refer to Chapter 16 of Ken's M is for (Data) Monkey book.

Final Words

This article presents nothing new and unusual as a Power Query technique. What is new and unusual is the way Power Query has been used for solving a typical business problem. This is just additional proof of how powerful and useful this tool is.

You can find the file with example here: Task tracking with PQ

Check the application version in Modern Office

In the good old days, it was easy to check the application version in Office with VBA.  You just used a little test of Val(Application.Version) to return the number.  12 was Office 2007, 14 was Office 2010, 15 was Office 2013, and 16 was Office 2016.  But then Office 365 came out, and 2019, and things fell apart.

Conducting a check of the application version in Modern Office is not as straightforward.  From Office 2016 onwards, Microsoft has not revved the Application.Version number - they all show as 16.0 - giving you no way to differentiate between versions.  (Bastien discussed this in a blog post a few months ago.) But worse, while he focuses on 2016 vs 2019, there is also no way to distinguish either of these from the Office 365 subscription versions.  As there are now things that work differently in Office 365 than in the perpetual licenses, this is another potential problem for developers.

This past week I ran into a scenario where I needed to do exactly this.  I needed a way to programmatically determine whether a user is running Office 2016, Office 2019 or Office 365, as I had to do something different in each case.

So how can we check the application version in modern Office?

After doing a little digging, I finally found a registry key that seems to appear in Office 2019 and Office 365, but does not exist in Office 2016.  So that was good news. And even better, that key holds values like "O365ProPlusRetail" vs "Office2019ProfessionalPlus".  While I haven't tested with other SKUs, this would seem to indicate a pattern I hope we can rely on.

Given that, I've pulled together this function.  Its purpose is fairly simple: test the application to see whether it is a perpetual license or a subscription install, and return the version number.  So anyone with Office 365 installed should receive 365 as the return value; otherwise you'll get a four-digit number representing the version you have installed.

Function to check the application version in Modern Office

Function AppVersion() As Long
    'Test the Office application version
    'Written by Ken Puls (www.excelguru.ca)

    Dim registryObject As Object
    Dim rootDirectory As String
    Dim keyPath As String
    Dim arrEntryNames As Variant
    Dim arrValueTypes As Variant
    Dim x As Long

    Select Case Val(Application.Version)

        Case Is = 16
            'Check for existence of Licensing key
            keyPath = "Software\Microsoft\Office\" & CStr(Application.Version) & _
                "\Common\Licensing\LicensingNext"
            rootDirectory = "."
            Set registryObject = GetObject( _
                "winmgmts:{impersonationLevel=impersonate}!\\" & _
                rootDirectory & "\root\default:StdRegProv")
            registryObject.EnumValues &H80000001, keyPath, _
                arrEntryNames, arrValueTypes

            On Error GoTo ErrorExit
            For x = 0 To UBound(arrEntryNames)
                If InStr(arrEntryNames(x), "365") > 0 Then
                    AppVersion = 365
                    Exit Function
                End If
                If InStr(arrEntryNames(x), "2019") > 0 Then
                    AppVersion = 2019
                    Exit Function
                End If
            Next x

        Case Is = 15
            AppVersion = 2013
        Case Is = 14
            AppVersion = 2010
        Case Is = 12
            AppVersion = 2007
        Case Else
            'Too old to bother with
            AppVersion = 0
    End Select

    Exit Function

ErrorExit:
    'Version 16, but no licensing key. Must be Office 2016
    AppVersion = 2016

End Function

If you'd prefer to just download a workbook with the code in it, here you go.

Care to help me test it?

I'd love it if people could give this a try and see if it returns correctly based on the versions of Excel you're running, particularly if you have a flavor of Office 365 or Excel 2019.

Let me know how it goes!

EDIT:  I have made a small change to the code and sample file in case "O365" is not at the beginning of the registry key.  This should pick it up no matter where in the key the 365 term shows up.  I am starting to wonder if this key is only present for Insiders.  So if you do test, please let us know what channel you are on in addition to whether or not it works!

Analyzing text with Power Query and Excel

We all know how good Power Query is at cleaning, shaping and analyzing numerical data. But this powerful tool opens doors for analysts that had been closed for a long time, such as analyzing text. In this article, I will give you some ideas of how Power Query can be used for analyzing unstructured text. This is a new area for me as well, so I’ll be grateful for any comments, thoughts, ideas and shared experience that will help to elaborate further on the methodology.

Our scenario

First, let’s put things in context with a simple but realistic example. Assume you are the newly appointed Marketing Manager of a 4-star SPA hotel. You want to analyze reviews about the hotel on sites like booking.com and tripadvisor.com to understand the major service issues. Thanks to Power Query, you will be able to extract, clean and shape data from those sites to receive a nice table like this:

Comments table

* Table contains ~300 real opinions for several Bulgarian SPA hotels in Hissarya, Velingrad, Sandanski and Bansko. Source: booking.com and tripadvisor.com

But how can you get some insights from this data? Obviously, you would spend lots of time reading all the reviews. And even if you did, you would still need to summarize the information somehow. So let’s play a little within Power Query and see how it can help us analyze text.

Preparing data for analysis

First, we load the table into Power Query (From Table) and clean it of punctuation symbols like commas, semicolons, dashes, brackets, etc., as well as replace capital letters with lowercase ones. Why that last one? Because we want to count the most frequently used words, and to Power Query ‘hotel’, ‘hotel,’ and ‘Hotel’ are different words.

The cleaning can easily be done with tools like Replace Values, Trim, and Lowercase. I believe they are well known to the average Power Query user, so I will skip the detailed operations. Instead, here are some lessons learned from practice:

  • First, duplicate the Opinions column so you can have original text.
  • Then, replace all punctuation symbols with a space.
    • Later we will use a period as a separator to split opinions into sentences. And since a sentence can finish with an exclamation mark or a question mark, it is useful to replace ‘!’ and ‘?’ with ‘.’
  • Since this is a hotel, clients may use the symbol ‘*’ in their opinions (e.g. ‘nice 4* hotel’ or ‘definitely not a 4* hotel’). So don’t replace this symbol.

The points above are nice examples of how, when analyzing text or other data, you need to work in context and explore the data first - but that is another story.

Here are the steps applied so far. As a final touch the text has been trimmed and cleaned.

Steps taken to clean data

Separating each comment

Our next step is to split the column using a period ('.') as a delimiter, and doing so at each occurrence of the delimiter.

Split columns by delimiter

Power Query creates a number of new columns, each one containing a sentence. I bet you are guessing what will follow. That’s right – select the original two columns and choose Unpivot Other Columns.

Unpivot other columns

After removing the Attribute column and filtering out rows that are blank, our table has a column for the original comments (Opinion) and a column for each sentence contained in the original comments (Sentence). Please be patient here, and you will understand why this is important.

Table containing full comments and each individual sentence

Let’s now duplicate the Sentence columns and repeat the above actions: split each sentence into its individual words, then unpivot. Below is the table after removing the Attribute column and filtering out blank rows:

Table containing full comments, each individual sentence, and each word

As a next step, I will add a custom column with a simple formula that calculates the length of each word. We can use it to remove common words that bring no value to our analysis – ‘a’, ‘the’, ‘is’, ‘or’, ‘but’, etc.

Custom function to calculate the length of each word

Finally, filter to exclude words that are 1, 2 or 3 letters long. We will use the resulting table as the basis for analyzing text.

Table filtered for words greater than 3 letters
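As a rough illustration of the whole prepare/split/count pipeline, here is how the same steps might look in Python (the 3-letter cutoff mirrors the article; the exact punctuation list and function name are my assumptions):

```python
import re
from collections import Counter

def word_frequencies(opinions, min_len=4):
    """Lowercase each opinion, normalize punctuation, split into
    sentences and then words, and count words of min_len letters or more."""
    counts = Counter()
    for opinion in opinions:
        text = opinion.lower()
        text = text.replace("!", ".").replace("?", ".")  # unify sentence endings
        text = re.sub(r'[,;:()\-"]', " ", text)          # punctuation to spaces ('*' kept)
        for sentence in text.split("."):
            for word in sentence.split():
                if len(word) >= min_len:
                    counts[word] += 1
    return counts
```

Keeping the sentence split explicit matters for the later steps, where we filter whole sentences by the words they contain.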

Finding Lookup words

As a first step in analyzing text, let’s Duplicate the query (you may use Reference as well).

Duplicating the query

In the new query, simply use the Group By function on the Word column to count the rows.

Grouping by the Word column

Thus, we will have a list with all words (above 3 letters long) and how many times they have been used. For convenience, we can sort this and then load it into Excel.

Word count table

Let’s take a look at the list. Obviously, it contains words like ‘with’, ‘this’ and ‘have’ that we need to ignore because they could be used in many different situations. On the other hand, we see ‘good’ and ‘nice’ in the top 20. Those words are highly subjective – what is good for me might not be good for you. Besides, we don’t know whether the reviewer said ‘good food and nice service’ or ‘the food was not good and the service far from nice’. So let’s skip them for now.

Apart from the above, we see that customers most often mention words like ‘room’, ‘pool’, ‘staff’ and ‘food’. These are our words of interest (I call them ‘Lookup words’). We may now filter for all sentences that contain these words, allowing us to read much less information compared to the original list of opinions.

Honestly speaking, if you have enough experience or even some common sense, you can skip this step. I mean, it is more than expected that ‘room’, ‘staff’ and ‘pool’ are the words most often mentioned in reviews for a SPA hotel. However, identifying lookup words might be needed in cases like analyzing the text of more open-ended questions.

Filtering Lookup words

So far we have identified our words of interest and listed them in a table:

List of lookup words

So how do we filter all sentences containing these lookup words? We can make it dynamic by creating a custom function. The process is similar to the one used to load data from multiple internet pages, as described in M is for (Data) Monkey.

First, we create a query to Filter Rows of the Word column that contain one of our lookup words, ‘staff’ for example.

Filtering for a look up word

Then we go to Home --> Advanced Editor and change the M code a bit, as shown below:

Updating the M code

After we create the function, we rename it to something more user-friendly (I used fxFilter).

Then we load our LookupWords table into Power Query (From Table):

LookupWords table

Next, we Add Custom Column. (Note: in Excel 2016, you have to Invoke Custom Function.)

Adding a custom column

And when we expand the table, we get all rows that contain the words from our lookup list.
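Conceptually, fxFilter just selects the word rows matching one lookup word and returns their sentences, once per row of the LookupWords table. In Python terms (names are mine), it amounts to something like:

```python
def filter_sentences(word_rows, lookup_words):
    """Given (sentence, word) rows, return the distinct sentences that
    contain each lookup word - a sketch of what invoking the custom
    filter function per lookup word produces."""
    return {
        lookup: sorted({sentence for sentence, word in word_rows if word == lookup})
        for lookup in lookup_words
    }
```

The expand step in Power Query then flattens these per-word results back into one table of matching rows.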

Identifying Connected words

We now need to split the Sentence column again to identify the words that were most often used in sentences containing our Lookup words. I call these words ‘Connected words’.

There are some final steps, such as eliminating words that appear rarely (e.g. fewer than 5 times), removing duplicate rows, rearranging columns, etc., that I will skip.

Our final table is on the left below:

Final table for analyzing the comments

Analyzing text

As you see, I added two slicers to our table - one for our Lookup words and one for our Connected words. By using them in combination, we can get more insights quite easily. For example, these are the words used most frequently together with ‘staff’:

Slicers used to identify comments regarding staff

Here, I have filtered for the Connected word ‘friendly’, which gives us just 10 rows to read:

Analyzing text for the word 'friendly'

In a separate column, we can manually type “Y” for each positive opinion and “N” for each negative one. Now we can calculate that 30% of the mentions are actually negative. This is a simple way to quantify and measure our data. And it takes just a few minutes to find out that the picture is similar for ‘helpful’ and ‘polite’ – positive feedback prevails.

When we filter for ‘English’, it is easy to see that 8 people mentioned that the staff speak no or poor English. That directly calls for some action - we need to train our staff in English. If you were reading through 300 opinions, you might easily miss this.

Analyzing text containing 'staff' and 'English'

It takes just a few clicks and a couple of minutes of reading to find out that clients are happy with the breakfast:

Analyzing text for comments on breakfast

But are disappointed by the prices and quality of food in the restaurant:

Analyzing text for comments on the restaurant

Of course this is just a starting point and each of the above observations needs further investigation and analysis. But the key takeaway here is that we can successfully use Power Query to summarize unstructured text data. We can then focus our attention on sentences that contain the most commonly used words instead of reading the whole bunch of reviews.

Applying the technique

Many other ideas come to mind for how we can use this to further analyze text data. We can use the Stayed column to study trends over time, we can quantify some key words (as shown with 'friendly'), or we can make a word cloud showing our top 20 words. What are your ideas - how can we use Power Query and Excel to analyse unstructured text?

Nuthin’ ain’t nuthin’ in Power Query

There are two kinds of nuthin' in Power Query: null, and blank. I tripped on this issue the other day, and Ken thought it would be a good idea for a blog post.

Let's just call out the two types of nuthin' in Power Query:

  • null is literally "no value" for any data type from text to table.  In other words, the cell is completely empty.
  • A blank also looks like "no value", but it is equivalent to a cell formula of ="" in Excel.  In other words, the cell holds a value that renders as blank.

Why is this important?  It's because, inside Power Query (and indeed many programming languages) null and blank are not equal!  And it turns out that nuthin' matters more than getting the right nuthin' in Power Query!

Burned by nuthin' in Power Query

Some time ago, I built a set of Excel Power Query transforms which report on data extracted from a client system. For some time the client data had been extracted into Excel files, but there were some problems. My solution was to extract the client data files as CSVs instead.

What I found, unexpectedly, was that blank data values are treated differently by the PQ import functions depending on whether the file being imported is an Excel file or a CSV file!

Here's the rules:

  • For an Excel import, blanks are converted to null - always.
  • For CSVs, blanks are imported as blanks (not nulls). But when a column's data type is changed in a Power Query step, blanks in numeric and date columns are converted to nulls, while blanks in text columns remain blanks (not nulls).
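If it helps, the distinction maps neatly onto Python's None versus an empty string - related, but not equal. A tiny sketch (my own framing, not Power Query's):

```python
def classify(value):
    """Tell the two kinds of nuthin' apart: a true null (no value at
    all, like None) versus a blank (a value that renders as empty, "")."""
    if value is None:
        return "null"
    if value == "":
        return "blank"
    return "value"
```

And just as in M, None == "" is False - which is exactly why tests written for one kind of nuthin' fail on the other.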

Seeing nuthin' in Power Query

What does it matter?  Have a look at the following cases...

1. Excel data with blanks loaded into PQ. Blank cells are imported as null.

2. CSV data with blanks imported to Power Query: Blanks are read as blanks.

3. CSV data with blanks: dates and numbers change to null after type change. Text blanks remain blank

The impacts of nuthin' in Power Query

Nuthin' in Power Query could have caused me even more issues here... my transforms used conditional columns to check for nulls in text columns, a test which fails when the value is a blank.

As Ken pointed out, another key issue is that the Fill Down and Fill Up functions in Power Query are used to fill null values. But blanks are not nulls, so the functions do not work as intended in my case.
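A quick sketch of why this bites: a fill-down that (like Power Query's) replaces only true nulls will sail right past blanks - and worse, a blank becomes the value that gets filled downward:

```python
def fill_down(column):
    """Mimic Power Query's Fill Down: replace only true nulls (None)
    with the last real value seen. Blanks ("") count as values, so
    they are kept - and they become the value filled into later nulls."""
    last, filled = None, []
    for value in column:
        if value is None:
            filled.append(last)
        else:
            filled.append(value)
            last = value
    return filled
```

For example, fill_down(["a", None, "", None]) returns ["a", "a", "", ""] - the blank survives untouched and then propagates into the null below it, which is exactly the surprise described above.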

How to deal with it? Caveat emptor! It appears Power Query treats blank data differently in the CSV and Excel import functions.

So, if you are building transforms based on a consistent source of imported data, there is little impact. But if you should need to change the type of data source like I did, then beware. I had to do some detailed testing, and I was looking at re-writing some long-standing and well-tested transforms.

Instead, for my specific situation, I added a value-replacement step to the CSV import in PQ to replace nuthin' (blank) with the keyword null for the whole data set. That allowed me to maintain my downstream logic, but it cost me some processing speed.

I hope this alerts PQ users to a potential issue in their transforms. If there are any alternate solutions to the problem let's see them in the comments.

Power Query Recipes Now Available

As you may have heard, we have been working on a set of helpful Power Query Recipes for Excel and Power BI and are excited to announce that they are now officially available!

Are our Power Query Recipes for you?

The Power Query Recipes are targeted at people who are familiar with the Power Query interface (in either Excel or Power BI), and will lead you step-by-step through the process needed to clean up and convert your data from one format into another.  And if you're not already comfortable with Power Query?  Consider joining the Power Query Academy so we can change that!

So what is in the Power Query Recipes package?

There are currently over 30 cards in the set, showing easy-to-follow steps that will deal with a variety of common data issues.  I personally am finding them super useful, and often refer back to them when I'm helping people clean up their data in person or in forums.

We've even marked each of the Power Query Recipes that has a video version in Power Query Academy.  If you are already subscribed to our Academy, simply click the video camera in the bottom corner, and it will take you straight to the appropriate video so that you can see the technique demonstrated in a live setting.

Are there samples of the Power Query Recipes?

Of course there are!

Here's an example of one of my favorites, which lays out how to create a Full Anti Join, something that does not exist in the regular Power Query user interface:

image

And another which shows how to create a Calendar Table on the fly:

image

We actually have a sample package available which includes four recipe cards (including the precedent card for the calendar recipe shown above), as well as the full table of contents for the current version.

Just some of the patterns included in the full version are:

  • Splitting data into new columns or rows
  • Pivoting, unpivoting, and transposing data
  • Several different ways to merge tables (including the Full Anti Join shown above)
  • Extracting a new column based on values in the prior or next rows
  • Six different ranking methods
  • Creating dynamic calendar tables
  • Adding a random number to all rows

Will there be more Power Query Recipes?

Oh yes!  While there are already over 30 recipes in the set, we have another 20 on our "to-do" list, and are adding to that list regularly.  Our intention is to release new recipes on a regular basis, putting even more Power Query resources at your fingertips.

How do I get my copy of the Power Query Recipes?

There are a few different options:

  1. If you are a Power Query Academy member*, the recipes are already included in your registration.
  2. Pre-order a copy of the new 2nd edition of our M is for (Data) Monkey book from Amazon by September 30, 2018 and receive a FREE set of the initial recipes.**
  3. Purchase the current stand-alone set of recipes from the Power Query Training web store for $24.95 USD. There is also an option to add a subscription ($2.95 USD every 3 months) so that you get all the new recipe cards as we expand the set.

* Please note that the recipe cards are not included in the free Power Query Academy trial, only with the full course.
** Upon receipt of your Amazon proof of purchase via email at support@powerquery.training, we will provide a coupon code for $24.95 which can be applied to either the stand-alone or subscription purchase.  Act fast though, as this offer ends on October 31, 2018.

Power Query Challenge 3 Results

Wow… 46 submissions for Power Query Challenge 3!  Crazy stuff. So many that I haven't even had time to read them all yet!

Due to the overwhelming responses, here's how I'm going to handle this:

  1. Show you how I approached this challenge
  2. Call out a couple of the submissions that I thought were cool and/or interesting
  3. Leave it to you to throw a note in the comments if you think that your (or someone else's) submission should have been listed.

You can find access to all of the submitted solutions (including mine) stored on my OneDrive account.

My solution to Power Query Challenge 3

Step 1 of Challenge 3: Assigning scores to letters

The most important component to Challenge 3 is to have a table that assigns a value to each letter, and there are a couple of ways you could do this.  I elected to do this via the following method:

  • Create a new blank query and name it LetterValue
  • Enter the following formula in the formula bar:
    • = {"A".."Z"}

This creates a nice list of letters from capital A through capital Z:

SNAGHTML99b7af

As great as this is, I still need to assign values to the letters (1 for A, 2 for B, etc.).  It's easy to do with an Index column, but the problem is that the button to do so is not available when you're working with a list.  No problem though, we just convert it to a table first:

  • Go to List Tools --> Transform --> To Table --> OK
  • Go to Add Column --> Index Column --> From 1
  • Rename the columns to Letter and Value
  • Set the data types to Text and Whole Number

And that's it.  The query (as shown below) can now be loaded as a connection only for later use:

SNAGHTML9d4b74
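For those who prefer to see the logic in code, here's the same A-to-Z lookup sketched in Python (purely illustrative - the actual solution is built entirely in Power Query):

```python
import string

# Mirror the {"A".."Z"} list plus a 1-based Index column:
# a lookup where A=1, B=2, ..., Z=26.
letter_value = {letter: index
                for index, letter in enumerate(string.ascii_uppercase, start=1)}

print(letter_value["A"], letter_value["Z"])  # 1 26
```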

Other options for Step 1 of Challenge 3

To be fair, I'm pretty comfortable knocking out a quick list, as I do it all the time for Calendar tables.  But as Bill Szysz pointed out, you get bonus points for this being 100% UI driven.  Could I have done that?  Of course!  Instead of creating the list, you would:

  • Create an Excel table that holds the letters and values
  • Pull it into Power Query
  • Right click the Letter column --> Transform --> Upper Case
  • Right click the Letter column --> Remove Duplicates
  • Load it as connection only

Is it easier?  Maybe.  Is it more robust?  It could be, particularly if you also want to score other characters.

Step 2 of Challenge 3: Scoring Words

So now we get into the money part of Challenge 3 - scoring the entire word.  Let's assume that we have a nice little Excel table which stores all the words in a column called "Word" like this:

image

(To be fair, the data could come from a database or anywhere else, the source is really incidental to the problem.)

To score these words we can:

  • Pull the data into Power Query
  • Right click the [Word] column --> Duplicate
  • Right click the [Word - Copy] column --> Transform --> UPPERCASE

This leaves us here:

image

Now the trick…(psst… I have a cool pattern card for this… watch this space in the next few days for news!)

  • Right click [Word - Copy] --> Split Column --> By Number of Characters
    • Choose to split by 1 character, repeatedly
    • From the Advanced Options, choose Rows

Your output should now look like this:

SNAGHTMLaa6bfd
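If you'd like to see what that split does conceptually, here's a sketch in Python (the column names mirror the query, but the code is illustrative only):

```python
# Split [Word - Copy] into one row per character, mirroring
# Split Column --> By Number of Characters (1, repeatedly, into Rows).
# Each row keeps the original word alongside one uppercased character.
def split_to_rows(word):
    return [(word, ch) for ch in word.upper()]

rows = split_to_rows("Apple")
print(rows)  # [('Apple', 'A'), ('Apple', 'P'), ('Apple', 'P'), ('Apple', 'L'), ('Apple', 'E')]
```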

Next:

  • Go to Home --> Merge Queries
  • Choose LetterValue and merge [Word - Copy] against [Letter]
  • Expand only the [Value] column from the [LetterValue] column (by clicking the expand icon at the top right of the column)

SNAGHTMLad0db2

The data is out of order now, but it doesn't matter.  It's time to put the finishing touches on Challenge 3…

  • Go to Transform --> Group By
  • Configure a basic grouping like this:
    • Group by Word
    • Create a Score column that SUMs the [Value] column

image
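The merge-and-group steps boil down to "look up each letter, then sum per word".  A Python sketch of that logic (illustrative only - the real work happens in the Power Query UI):

```python
import string

# The LetterValue lookup: A=1 through Z=26.
letter_value = {c: i for i, c in enumerate(string.ascii_uppercase, start=1)}

def score(word):
    # Merge each character against LetterValue, then group by
    # word and SUM the matched values.
    return sum(letter_value[ch] for ch in word.upper())

print(score("Apple"))  # 50
```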

And you're done!

image

You can now throw any value into the Excel table, hit refresh, and it will score the words…

image

Well how about that…

And our upcoming Power Query Recipe cards look like they could be way better than hard work too!  (Stay tuned for their release next week!)

Your Challenge 3 Submissions

With 46 submissions for Power Query Challenge 3, it was impossible to go through them all.  I'm hoping that you'll consider being part of the community of reviewers here and check out a few random ones, then post in the comments any that you think I should have mentioned.  I can tell you that in the first four submissions alone there are three different versions of custom functions, and a UI driven approach.

But there are a couple of submissions I looked at that I did want to highlight, as they brought more than just a query to the game.

Bart Titulaer

Bart's solution includes not only his Power Query work (which he tried more than one way), but he also thought it might be fun to include some frequency distributions with the solution:

image

It's a good reminder that the end goal of Power Query isn't just to clean up data, it's to actually use it.  (Something I probably don't cover enough on this blog!)

Kolyu Minevski

Kolyu decided to compare scoring English vs Bulgarian, and even summed it up for me with a note:

image

Nice to know since I'll be back in Sofia at Bulgaria Excel days on November 1!

Power Query Challenge 3

**Please note that the challenge is now closed, so we are no longer accepting submissions. However, you are still welcome to try it out on your own.**

I've got a lot of feedback that you enjoy the Power Query Challenge series we've been running, so it's time for another!  Challenge 3 is just a fun one that was inspired by a conversation I had with Alex J.

Background for Challenge 3

I'm sure you've all seen this before:

If the letters A-Z are worth a value of 1-26 respectively, then:

  • Knowledge = 96%
  • Hardwork = 98%

Of course, the joke is that Attitude is worth 100%.
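You can verify the joke's arithmetic with a few lines of Python (just a sanity check, not part of the challenge itself):

```python
import string

# 1 for A, 2 for B, ... 26 for Z, regardless of case.
value = {c: i for i, c in enumerate(string.ascii_uppercase, start=1)}

def score(word):
    return sum(value[c] for c in word.upper())

print(score("Knowledge"), score("Hardwork"), score("Attitude"))  # 96 98 100
```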

Your job for Challenge 3

For challenge 3, based on assigning a letter score of 1 for A, 2 for B, 3 for C, etc…:

  • Take a table of words
  • Return the total score using Power Query

So basically… do this:

image

But keep in mind that correctly solving Power Query Challenge 3 requires one very important thing… It needs to work no matter the case of the letters in the original column.

Before you submit your answers to Power Query Challenge 3

After our previous challenges, I got feedback that people really want to see the solutions submitted by others.  I haven't shared them only because I didn't specifically say I was going to, and I can't guarantee that there isn't personally identifiable information in them.  That will change here…

Submitting Your Answer to the Power Query Challenge

**The challenge is now closed, so we are no longer accepting submissions.**

To submit your answer:

  • Please name your file using your name - or the name of your alter-ego if you prefer.  (Keep in mind that your real name could be in the file properties, and it's up to you to clean that out.)  The reason we need a name is so that anyone reading solutions can ask questions about your solution.
  • We have to cap this somewhere, so we will allow submissions until the end of Sunday, Sep 16 only.
  • While the challenge is open, you can submit your answer to Rebekah at Excelguru dot ca with the subject Power Query Challenge 3.

After Submissions are closed:

I will post my solution to Power Query Challenge 3, and share a link to a folder of all the submitted solutions.

Please note that last challenge we had over 30 submissions.  While I may post some highlights from the solutions, I won't be doing a full write-up of them all due to the volume we expect to see.

Have fun!

Power Query Challenge #2 Results

What an overwhelming response to Power Query Challenge #2!  We had 40 submissions, and some with multiple entries in a single submission.  Plainly you all enjoyed this!

Naturally, there were a couple of submissions that involved custom functions, and a couple who wrote manual grouping functions to get things done.  These folks obviously know how M works, so I'm going to focus more on the other entries to show different UI-driven routes to accomplish the goal.  Each of those is included in the workbook that you can download here.

The Base Query

I'm going to start this by creating a base query called "Source Data" which has only 2 steps:

  • Connect to the Data Table
  • Set the data types

This is going to make it easy to demo certain things, but also replicates what a lot of you did anyway.

Most Popular Solutions to Power Query Challenge #2

By far the most popular solution to Power Query Challenge #2 started with one of the following two methods:

Method 1A - Group & Merge

  • Reference the Source Data query
  • Merge Customer & Membership
  • Remove duplicates on the merged column
  • Group by the Customer column and add a Count of Rows

Method 1B - Group & Merge

  • Reference the Source Data query
  • Remove all columns except Customer & Membership
  • Group by the Customer column and add a Count of Distinct Rows

Either of these methods would leave you with something similar to this:

image

Method 1 Completion

No matter which way you sliced the first part, you would then do this to finish it off:

  • Filter the Count column to values greater than 1
  • Merge the filtered table against the original data set:
    • Matching the Customer column
    • Using an Inner join
  • Remove all columns except the new column of tables

image

  • Expand all columns
  • Set the data type of the Data column and you're good

image
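Here's the whole of Method 1 condensed into a Python sketch, using a hypothetical customer/membership table (illustrative only - the real solution is pure Power Query UI):

```python
from collections import defaultdict

# Hypothetical source rows: (Customer, Membership).
transactions = [
    ("Ann", "Gold"), ("Ann", "Gold"), ("Ann", "Silver"),
    ("Bob", "Gold"), ("Bob", "Gold"),
]

# Group by Customer, counting distinct memberships (Method 1B).
distinct = defaultdict(set)
for customer, membership in transactions:
    distinct[customer].add(membership)

# Filter to counts greater than 1, then inner-join back
# to the original rows (Method 1 Completion).
multi = {c for c, members in distinct.items() if len(members) > 1}
result = [row for row in transactions if row[0] in multi]

print(result)  # Ann's three rows; Bob is filtered out
```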

Of the 34 entries, this variation showed up in at least 25 of them.  Sometimes it was all done in a single query (referencing prior steps), sometimes in 3 queries, and sometimes it wasn't quite as efficiently done, but ultimately this was the main approach.

A Unique Solution to Power Query Challenge #2

I only had one person submit this solution to Power Query Challenge #2.  Given that it is 100% user interface driven and shows something different, I wanted to show it as well.  I've labelled this one as Pivot & Merge.

Here's the steps:

  • Reference the Source Data query
  • Remove all columns except Customer & Membership
  • Select both columns --> Remove Duplicates
  • Pivot the Customer column (to get columns of products by customer)

image

  • Demote the headers to first row
  • Transpose the table

And at that point, you have this view:

image

Look familiar?  You can now finish this one using the steps in "Method 1 Completion" above.

Personally, I don't think I'd go this route, only because the Pivot/Transpose could be costly with large amounts of data.  (To be fair, I haven't tested any of these solutions with big data.)  But it is cool to see that there are multiple ways to approach this.

The Double Grouping Solution to Power Query Challenge #2

This is the solution that I cooked up originally, and is actually why I threw this challenge out.  I was curious how many people would come up with this, and only a couple of people put this out there.  So here's how it works:

  • Reference the Source Data query
  • Stage 1 grouping:
    • Group the data by Customer and Membership
    • Add a column called "Transactions" using the All Rows operation

This leaves you here:

image

Now, you immediately group it again using a different configuration:

  • Group by Customer
  • Add columns as follows:
    • "Products" using the Count Distinct Rows operation
    • "Data" using the All Rows operation

Which leaves you at this stage:

image

It's now similar to what you've seen above, but we have a nested table that contains our original data.  To finish this off, we now need to do this:

  • Filter Products to Greater than 1
  • Expand only the Transactions column from the Data column
  • Right click the Transactions column --> Remove Other Columns
  • Expand all fields from the Transactions column
  • Set the data types for all the columns

And you're there!

image
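For the code-curious, the double grouping can be sketched in Python with nested collections (hypothetical data, illustrative only - the actual solution uses Power Query's Group By dialog twice):

```python
from collections import defaultdict

# Hypothetical source rows: (Customer, Membership, Amount).
transactions = [
    ("Ann", "Gold", 10), ("Ann", "Gold", 20), ("Ann", "Silver", 5),
    ("Bob", "Gold", 15),
]

# Stage 1: group by Customer and Membership, nesting All Rows.
stage1 = defaultdict(list)
for row in transactions:
    stage1[(row[0], row[1])].append(row)

# Stage 2: group by Customer; "Products" is the number of stage-1
# groups per customer, and "Data" is the nested rows themselves.
stage2 = defaultdict(list)
for (customer, _membership), rows in stage1.items():
    stage2[customer].append(rows)

# Filter Products > 1, then expand the nested rows back out.
result = [row
          for groups in (g for c, g in stage2.items() if len(g) > 1)
          for rows in groups
          for row in rows]

print(result)  # Ann's three original rows; Bob drops out
```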

Final Thoughts

Again, there were more solutions submitted for Power Query Challenge #2.  We had:

  • A couple of custom function submissions (of which each was slightly different)
  • A couple of custom grouping solutions (not written through the UI)
  • A couple of solutions that used grouping, then used a custom column to create a table based on the grouped output which filtered to distinct items

If I haven't covered yours here and you feel that I missed something important, please drop it in the comments below!

The part that fascinates me most about this is that we had UI-driven submissions involving merging, transposing and grouping.  Three different methods to get to the same end result.

Thanks for the submissions everyone!