Task Tracking with Power Query

Did you know Power Query can be used as a task tracking tool? This might sound unusual, but the method described here has been used to solve a real business case. The example I will use is simplified yet still close to reality, and it will demonstrate how to build task tracking with Power Query.

Laying out the Scenario

Vicky manages a small team that deals with customer questions on various topics. One of her duties is to distribute these questions among her subordinates. Each of them should then take action and report the status of each task.

The problem is – how can each employee see what tasks are assigned to him/her and fill in the respective information for each task? At the same time, Vicky should at any moment be able to assign a new task and review the statuses of old ones. This is the table Vicky needs:
Task Tracking with Power Query

Unfortunately, she has no other tool at hand except Excel. Luckily, she can set up task tracking with Power Query right in Excel, which could work perfectly in this case.

Setting up Task Tracking with Power Query

So let's start building the solution.

1. Load the left table (in this example, called Filled by Manager) into Power Query.
Manager's table to assign tasks

2. Next, create one query for each employee by filtering the Employee column.
Create individual employee queries
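For those curious about what each of these queries looks like under the hood, here is a minimal M sketch of one per-employee query. It assumes the query from step 1 is named Filled by Manager and the employee is Ivan; the names are assumptions based on this example:

    let
        // Reference the query created in step 1 (query name assumed)
        Source = #"Filled by Manager",
        // Keep only the rows assigned to this particular employee
        FilteredRows = Table.SelectRows(Source, each [Employee] = "Ivan")
    in
        FilteredRows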

3. Load each Employee table into a separate Excel sheet. (Of course, these can be in separate files linked to the source table.)
Sample employee table showing tasks assigned

4. Then, create a table for each employee to fill in the actions and statuses.
Employee's task tracking worksheet

You can see in the above picture what each employee will have in his/her worksheet - a green table on the left with the tasks assigned to them, and a yellow table on the right where they fill in the respective information.

Creating the Filled By Employees Table

5. Load all the Employee tables into Power Query.
Load all the employee tables into Power Query

6. Append them in a new query (in this example, called Statuses).
Append all the employee tables into new Statuses query
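In M terms, the append is essentially a Table.Combine over the individual employee queries. A minimal sketch, assuming the employee queries are simply named after the employees:

    let
        // Stack the individual employee tables on top of each other (query names assumed)
        Source = Table.Combine({Ivan, Maria})
    in
        Source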

You are probably guessing what the next step is – load the Statuses query into Excel right next to the Filled By Manager table.

However, the result is not what we would expect.
The Filled by Manager table does not match the newly loaded Filled by Employees table

Note that the first row of the Manager’s table shows a task assigned to Ivan on 27.01.2019, but row 1 of the Employee’s table shows a task assigned to Maria on 09.02.2019.

In order to fix this mess, we need one additional query.

Building the Task Code Query

7. Once again, load the Manager’s table into Power Query and remove all columns except for Task Code.
Task Code Column

Task Code is a unique identifier of each task. In this example, it is simply composed of the employee's name and the number of occurrences of this name in the register up to the respective row. In Excel language, the formula is:
Use COUNTIF to create unique task identifiers

The trick is that we fix the first row of the range in column F (which contains the employees' names) but leave the end row unfixed, so the count grows with each row.
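As a rough sketch of that formula, assuming the employee names start in cell F2 and this is entered on row 2 of the Task Code column and then filled down (the exact cell references depend on your layout):

    =F2 & COUNTIF($F$2:F2, F2)

The absolute reference $F$2 anchors the start of the range, while the relative F2 at the end lets the range grow row by row, so each task gets the employee's name plus a running occurrence number (Ivan1, Ivan2, Maria1, and so on).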

8. Merge the Register Employees and Statuses queries together.
Merge the Register Employees and Statuses tables

9. Finally, expand the table and voila - it is in the required order. The only thing left is to load it back into the Manager’s table.
Final table for task tracking with Power Query
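Behind the scenes, the merge and the expand boil down to something like the following M sketch. The Task Code key and the expanded column names (Action Taken, Status) are assumptions for this example:

    let
        // Merge the Task Code query (step 7) with the appended Statuses query (step 6)
        Source = Table.NestedJoin(#"Register Employees", {"Task Code"}, Statuses, {"Task Code"}, "Statuses", JoinKind.LeftOuter),
        // Expand the columns the employees filled in (column names assumed)
        Expanded = Table.ExpandTableColumn(Source, "Statuses", {"Action Taken", "Status"})
    in
        Expanded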

Now, any time she needs to, Vicky can refresh the Filled by Employees table and see the updated statuses of each task.

Likewise, each of her subordinates can simply refresh the Manager’s table (the green one on the left of his/her tab) to see any new tasks that have been assigned.

You could also automate the refresh operation with VBA. For more details, refer to Chapter 16 of Ken's M is for (Data) Monkey book.

Final Words

This article presents nothing new and unusual as a Power Query technique. What is new and unusual is the way Power Query has been used for solving a typical business problem. This is just additional proof of how powerful and useful this tool is.

You can find the example file here: Task tracking with PQ

Become a Data Master with Power Query

Ken is really excited to be teaching his popular Master Your Data with Power Query class for the first time in New York City! Join us this spring for a small group hands-on workshop and learn how to become a Data Master with Power Query.

Learn to become a Data Master with Power Query

Join Ken Puls for a live hands-on session in New York, NY on April 17, 2019.

What's so great about Power Query?

If there is one thing you need to learn in Excel today, Power Query is it. With Power Query, you can clean, reshape, and combine your data with ease. No more tedious cutting and pasting between multiple files. No more manually removing garbage rows or adding new columns. And no more repeating the same time-consuming steps whenever you need to refresh the data.

Instead, once you have your data the way you want it, all you have to do is click refresh and it will be ready to be loaded into the next day's/week's/month's/quarter's report. You can even schedule these refreshes to happen automatically!

What will be covered in the workshop?

The day will begin with a quick overview of Excel tables, PivotTables, and what makes "good" data. Next, Ken will show you how to import data from a wide variety of files, including Excel workbooks, CSV and TXT files, databases, and even entire folders. You'll be able to clean, transform, and refresh your data in Power Query with just a few clicks.

Ken will also show you how to append (or stack) data from multiple tables and 7 ways to merge (or join) tables without any VLOOKUPs. You'll be able to pivot data like this:

Pivoting Data with Power Query

and unpivot data like this:

Unpivoting Data with Power Query

But wait, there's more!

Ken will teach you some more advanced techniques using conditional logic. He will also give an overview of best practices for structuring your queries and query folding. Additionally, you'll receive a copy of the course slides to refer back to. Many of these slides contain handy recipes that will lead you step-by-step through the data transformation techniques.

Not only that, you will be able to directly ask a leading Power Query expert to help you with challenges you are currently facing with your own data. That kind of in-person access is invaluable!

How can Power Query help me?

A data wrangler spends the majority of their time just gathering, cleaning, and preparing the data before it can even be used in a report, chart, or other data model. Instead, become a Data Master with Power Query and get hours of your time back. For example, Ken was able to help a workshop attendee automate a workflow in 30 seconds, saving them 6 hours per week!

Power Query is the data preparation tool of the future, not only for Excel but also for Power BI Desktop, Microsoft Flow, and more. Thus, everything you learn in this course is transferable to other technologies - giving you more bang for your buck.

How do I make the case to my boss for sending me to the course?

As Ken mentioned in a previous blog post, the cost of the course can look like a lot up front. This is especially true if you must also pay for travel, hotel, etc. But divide the $499 USD registration fee + any expenses by your hourly rate. You'll see a return on investment pretty quickly with the time you save using Power Query.

However, the real value of the training comes in when you look at what you can do for your company with that extra time. You can now focus on analyzing the data instead of preparing it. See, Power Query turns your data into information. It allows you to identify new opportunities, make better decisions, and add real value to your organization.

Where do I go to become a Data Master with Power Query?

Ken will be leading this full-day workshop on Wednesday, April 17, 2019 at the NYC Seminar and Conference Center in New York City. But this is a small group session so there are only a limited number of spots available. Register today to secure your chance to receive personal guidance from a world-class Power Query expert. Go to the Excelguru website to view the full course description and register online.

Cache Shared Nodes Fix is Live

At long last, we have confirmation that the Cache Shared Nodes Fix is live in Excel.  If you're not familiar with this issue, it's one of the most important changes implemented in Power Query in quite some time.  You can read more about the issue in my guest post on Rob Collie's blog here.

What versions of Excel will get the Cache Shared Nodes Fix?

The Cache Shared Nodes fix is available to:

  • Office 365 subscribers
  • Excel 2019 (non-subscription) versions

This leaves you with the inefficient multi-refresh challenge if you are using Excel 2010, Excel 2013 or Excel 2016.  My understanding is that Microsoft does not intend to back-port the fix to these versions.  What that means to you is that in order to get the fix, you will need to upgrade to a newer version.

Do I have the Cache Shared Nodes Fix?

You need to be running Excel 16.0.10726.* to have the update.  To check if you have it, go to File --> Account --> About Excel.  Your current version and build are listed at the top:

Office 365 Insider's Build

Excel 2019 Professional Plus (non-subscription)

How do I update my Excel 365/2019 to get the Cache Shared Nodes Fix?

For users of Excel 2019, make sure your Windows Update settings include the advanced option to get updates for other Microsoft Software.  If your version is not updated yet, it should come through on your next update cycle.

For users of Office 365, you should actually already have the fix in place.  If not, go to File --> Account --> Update Options.

(There is a possible exception for Office 365 if you're running on the Deferred Channel for updates.  If that's the case, you either need to get onto a more current channel, or... wait until the deferred channel also has the fix.)

Building BI in Excel Course

Are you interested in learning how to clean and shape data with Power Query, as well as how to model it using Power Pivot? Don’t know which of these mysterious skills to tackle first? Want to learn about building BI in Excel where you create refreshable and maintainable solutions?

Good news: Ken Puls will be in Wellington, New Zealand on February 25-26, 2019 leading a live 2-day, hands-on session covering these essential skills!

What does Building BI in Excel cover?

In Day 1, you’ll learn how Power Query can clean up, reshape and combine your data with ease – no matter where it comes from. You can convert ASCII files into tables, combine multiple text files in one shot, and even un-pivot data. These techniques are not only simple, but an investment in the future! With Power Query’s robust feature set at your fingertips, and your prepared data, you can begin building BI in Excel using Power Pivot. The best part is that these dynamic business intelligence models are refreshable with a single click.

Un-pivoting Subcategorized Data

Un-pivoting subcategorized data is easy with Power Query

Day 2 focuses on Power Pivot, a technology that is revolutionizing the way that we look at data inside Microsoft Excel. Power Pivot allows you to link multiple tables together without a single VLOOKUP statement. It also enables you to pull data together from different tables, databases, the web, and other sources like never before. But this just scratches the surface! We'll also focus on proper dimensional modeling techniques and working with DAX formulas to report on your data the way you need to see it.

Top Selling Servers Report

Build dynamic reports that are easy to filter and refresh

Who is this course for?

Building BI in Excel is for anyone who regularly spends hours gathering, cleaning and/or consolidating data in Excel. It's also valuable for anyone responsible for building and maintaining reports. Participants must have experience using PivotTables. Exposure to Power Pivot and Power Query is not required, but it is a bonus.

Where do I sign up?

We are offering this course in conjunction with Auldhouse, a leading computer training company in New Zealand. Go to the Auldhouse site and use the following promo code EARLYBIRD20 to give yourself a 20% discount.

This will knock $300 NZ off the course, bringing it down to $1200. That’s $600 per day for pretty much the best introduction to both Power Query and Power Pivot that money can buy! Then use your new skills to free up 90% of your data-wrangling time, giving you time to negotiate a 20% pay increase*. Unbeatable ROI!

Don't miss out, the early bird discount is only available until January 31, 2019! Visit the Auldhouse site today for full details and registration.

*Numbers are indicative only, your mileage may vary. Heck, it may be way better than that!

Analyzing text with Power Query and Excel

We all know how good Power Query is for cleaning, shaping and analyzing numerical data. But this powerful tool also opens doors for analysts that had been closed for a long time, such as analyzing text. In this article, I will give you some ideas about how Power Query can be used for analyzing unstructured text. This is a new area for me as well, so I’ll be grateful for any comments, thoughts, ideas and shared experience that will help elaborate further on the methodology.

Our scenario

First, let’s put things in context with a simple but realistic example. Assume you are the newly appointed Marketing Manager of a 4-star SPA hotel. You want to analyze reviews about the hotel on sites like booking.com and tripadvisor.com to understand the major service issues. Thanks to Power Query, you will be able to extract, clean and shape data from those sites to end up with a nice table like this:

Comments table

* Table contains ~300 real opinions for several Bulgarian SPA hotels in Hissarya, Velingrad, Sandanski and Bansko. Source: booking.com and tripadvisor.com

But how can you get insights from this data? Obviously, reading all the reviews would take a lot of time, and even if you did, you would still need to summarize the information somehow. So let’s play a little in Power Query and see how it can help us analyze text.

Preparing data for analysis

First, we load the table into Power Query (From Table) and clean it of punctuation symbols like commas, semicolons, dashes, brackets, etc., as well as replace capital letters with lowercase ones. Why that last step? Because we want to count the most frequently used words, and for Power Query ‘hotel’, ‘hotel,’ and ‘Hotel’ are three different words.

The cleaning can easily be done with tools like Replace Values, Trim, and Lowercase. I believe they are well known to the average Power Query user, so I will skip the detailed operations. Instead, here are some lessons learned from practice:

  • First, duplicate the Opinions column so you can have original text.
  • Then, replace all punctuation symbols with a space.
    • Later we will use a period as a separator to split opinions into sentences. And since a sentence can end with an exclamation mark or a question mark, it is useful to replace ‘!’ and ‘?’ with ‘.’
  • Since this is a hotel, clients may use the symbol ‘*’ in their opinions (e.g. ‘nice 4* hotel’ or ‘definitely not a 4* hotel’). So don’t replace this symbol.

The points above are good examples of how, when analyzing text or any other data, you need to work in context and explore the data first, but that is another story.

Here are the steps applied so far. As a final touch, the text has been trimmed and cleaned.

Steps taken to clean data
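If you prefer to see the cleaning expressed in M rather than through the ribbon, here is a minimal sketch. The table name (Comments), the duplicated column name and the handful of punctuation replacements shown are illustrative only:

    let
        Source = Excel.CurrentWorkbook(){[Name="Comments"]}[Content],
        // Keep the original text untouched in its own column
        Duplicated = Table.DuplicateColumn(Source, "Opinion", "Opinion - Copy"),
        // Replace punctuation with spaces; '!' and '?' become '.' so sentences can later be split on periods
        NoCommas = Table.ReplaceValue(Duplicated, ",", " ", Replacer.ReplaceText, {"Opinion - Copy"}),
        Exclamations = Table.ReplaceValue(NoCommas, "!", ".", Replacer.ReplaceText, {"Opinion - Copy"}),
        Questions = Table.ReplaceValue(Exclamations, "?", ".", Replacer.ReplaceText, {"Opinion - Copy"}),
        // Lowercase and trim the working copy
        Lowered = Table.TransformColumns(Questions, {{"Opinion - Copy", Text.Lower, type text}}),
        Trimmed = Table.TransformColumns(Lowered, {{"Opinion - Copy", Text.Trim, type text}})
    in
        Trimmed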

Separating each comment

Our next step is to split the column using a period ('.') as the delimiter, splitting at each occurrence of the delimiter.

Split columns by delimiter

Power Query creates a number of new columns, each one containing a sentence. I bet you are guessing what will follow. That’s right – select the original two columns and choose Unpivot Other Columns.

Unpivot other columns

After removing the Attribute column and filtering out rows that are blank, our table has a column for the original comments (Opinion) and a column for each sentence contained in the original comments (Sentence). Please be patient here, and you will understand why this is important.

Table containing full comments and each individual sentence
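In M, the split and unpivot amount to roughly the sketch below. It continues from the cleaned table above (referenced here as a query named CleanedComments); the cap of 20 sentence columns and the columns kept during the unpivot are assumptions:

    let
        // Reference the cleaned table from the previous section (query name assumed)
        Source = CleanedComments,
        // Split the working copy into up to 20 sentence columns at every period
        SplitBySentence = Table.SplitColumn(Source, "Opinion - Copy", Splitter.SplitTextByDelimiter(".", QuoteStyle.Csv), 20),
        // Keep the original Opinion column (plus any other original columns) and unpivot the rest
        Unpivoted = Table.UnpivotOtherColumns(SplitBySentence, {"Opinion"}, "Attribute", "Sentence"),
        RemovedAttribute = Table.RemoveColumns(Unpivoted, {"Attribute"}),
        // Drop the blank sentences created by the split
        NonBlank = Table.SelectRows(RemovedAttribute, each [Sentence] <> null and Text.Trim([Sentence]) <> "")
    in
        NonBlank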

Let’s now duplicate the Sentence column and repeat the above actions: split each sentence into its individual words, then unpivot. Below is the table after removing the Attribute column and filtering out blank rows:

Table containing full comments, each individual sentence, and each word

As a next step, I will add a custom column with a simple formula that calculates the length of each word. We can use it to remove common words that bring no value to our analysis – ‘a’, ‘the’, ‘is’, ‘or’, ‘but’, etc.

Custom column to calculate the length of each word

Finally, filter to exclude words that are 1, 2 or 3 letters long. We will use the resulting table as the basis for analyzing text.

Table filtered for words greater than 3 letters
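The custom column and the length filter can be sketched in M like this, assuming the table after the word split is held in a query called WordsTable and the word column is named Words:

    let
        Source = WordsTable,
        // Calculate the length of each word
        AddedLength = Table.AddColumn(Source, "Length", each Text.Length([Words]), Int64.Type),
        // Drop the short, low-value words ('a', 'is', 'the', 'or', ...)
        LongWords = Table.SelectRows(AddedLength, each [Length] > 3)
    in
        LongWords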

Finding Lookup words

As a first step in analyzing text, let’s Duplicate the query (you may use Reference as well).

Duplicating the query

In the new Query, simply use the Group By function on the Words column to count the rows.

Grouping by the Word column

Thus, we will have a list of all words (more than 3 letters long) and how many times they have been used. For convenience, we can sort this and then load it into Excel.

Word count table
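The Group By step is a one-liner in M. A minimal sketch, again assuming the filtered word table is a query named WordsTable:

    let
        Source = WordsTable,
        // Count how many times each word appears, then sort descending
        Grouped = Table.Group(Source, {"Words"}, {{"Count", each Table.RowCount(_), Int64.Type}}),
        Sorted = Table.Sort(Grouped, {{"Count", Order.Descending}})
    in
        Sorted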

Let’s take a look at the list. Obviously, it contains words like ‘with’, ‘this’ and ‘have’ that we need to ignore because they could be used in many different situations. On the other hand, we see ‘good’ and ‘nice’ in the top 20. Those words are highly subjective – what is good for me might not be good for you. Besides, we don’t know whether the reviewer said ‘good food and nice service’ or ‘the food was not good and the service far from nice’. So let’s skip them for now.

Apart from the above, we see that customers most often mention words like ‘room’, ‘pool’, ‘staff’ and ‘food’. These are our words of interest (I call them ‘Lookup words’). We may now filter all sentences that contain these words, allowing us to read much less information compared to the original list of opinions.

Honestly speaking, if you have enough experience or even some common sense, you can skip this step. I mean, it is to be expected that ‘room’, ‘staff’ and ‘pool’ are the words most often mentioned in reviews for a SPA hotel. However, identifying lookup words might still be needed for cases like analyzing the text of more open-ended questions.

Filtering Lookup words

So far we have identified our words of interest and listed them in a table:

List of lookup words

So how do we filter all sentences containing these lookup words? We can make it dynamic by creating a custom function. The process is similar to the one used to load data from multiple internet pages, as described in M is for (Data) Monkey.

First, we create a query that uses Filter Rows on the Words column to keep the rows containing one of our lookup words, ‘staff’ for example.

Filtering for a look up word

Then we go to Home > Advanced Editor and change the M code a bit as shown below:

Updating the M code
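The edit is small: wrap the filtering query in a function that takes the lookup word as a parameter. A minimal sketch, assuming the Opinion / Sentence / Words table sits in a query named WordsTable:

    (LookupWord as text) =>
    let
        Source = WordsTable,
        // Keep the rows whose word contains the lookup word, instead of the hard-coded "staff"
        FilteredRows = Table.SelectRows(Source, each Text.Contains([Words], LookupWord))
    in
        FilteredRows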

After we create the function, we rename it to something more user-friendly (I used fxFilter).

Then we load our LookupWords table into Power Query (From Table):

LookupWords table

Next, we Add Custom Column. (Note: in Excel 2016, you have to Invoke Custom Function.)

Adding a custom column

And when we expand the table, we get all rows that contain the words from our lookup list.
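Put together, the custom column and the expand look roughly like this in M; the Lookup Word column name and the expanded column names are assumptions for this example:

    let
        Source = Excel.CurrentWorkbook(){[Name="LookupWords"]}[Content],
        // Call fxFilter once per lookup word; each cell now holds a table of matching rows
        AddedMatches = Table.AddColumn(Source, "Matches", each fxFilter([Lookup Word])),
        // Flatten the nested tables back into one list of matching rows
        Expanded = Table.ExpandTableColumn(AddedMatches, "Matches", {"Opinion", "Sentence", "Words"})
    in
        Expanded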

Identifying Connected words

We now need to split the Sentence column again to identify the words that were most often used in sentences containing our Lookup words. I call these words ‘Connected words’.

There are some final steps, such as eliminating words that appear rarely (e.g. fewer than 5 times), removing duplicate rows, rearranging columns, etc., that I will skip.

Our final table is on the left below:

Final table for analyzing the comments

Analyzing text

As you see, I added two slicers to our table - one for our Lookup words and one for our Connected words. By using them in combination, we can get more insights quite easily. For example, these are the words used most frequently together with ‘staff’:

Slicers used to identify comments regarding staff

Here, I have filtered for the Connected word ‘friendly’, which gives us just 10 rows to read:

Analyzing text for the word 'friendly'

In a separate column, we can manually type “Y” for each positive opinion and “N” for each negative one. Now we can calculate that 30% of mentions are actually negative. This is a simple way to quantify and measure our data. And it takes just a few minutes to find out that the picture is similar for ‘helpful’ and ‘polite’ – positive feedback prevails.

When we filter for ‘English’, it is easy to see that 8 people mentioned that the staff speak no or poor English. That directly calls for some action - we need to train our staff in English. If you were reading through 300 opinions, you could miss this quite easily.

Analyzing text containing 'staff' and 'English'

It takes just a few clicks and a couple of minutes of reading to find out that clients are happy with the breakfast:

Analyzing text for comments on breakfast

But they are disappointed by the prices and the quality of the food in the restaurant:

Analyzing text for comments on the restaurant

Of course this is just a starting point and each of the above observations needs further investigation and analysis. But the key takeaway here is that we can successfully use Power Query to summarize unstructured text data. We can then focus our attention on sentences that contain the most commonly used words instead of reading the whole bunch of reviews.

Applying the technique

Many other ideas come to mind for how we can take this further when analyzing text data. We can use the Stayed column to study trends over time, quantify some key words (as shown with 'friendly'), or make a word cloud showing our top 20 words. What are your ideas - how can we use Power Query and Excel to analyse unstructured text?

Nuthin’ ain’t nuthin’ in Power Query

There are two kinds of nuthin' in Power Query: null, and blank. I tripped on this issue the other day, and Ken thought it would be a good idea for a blog post.

Let's just call out the two types of nuthin' in Power Query:

  • null is literally "no value" for any data type from text to table.  In other words, the cell is completely empty.
  • A blank also looks like "no value", but it is equivalent to a cell formula of ="" in Excel.  In other words, the cell holds a value that renders as blank.

Why is this important?  It's because, inside Power Query (and indeed many programming languages) null and blank are not equal!  And it turns out that nuthin' matters more than getting the right nuthin' in Power Query!
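A quick way to convince yourself of the difference is to type a couple of throwaway expressions into the Power Query formula bar (each entered on its own after the = sign); a small sketch:

    "" = null           // false: a zero-length text value is not the same as null
    Text.Length("")     // 0: a blank is a real text value that simply renders as empty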

Burned by nuthin' in Power Query

Some time ago I built a set of Excel Power Query transforms which report on data extracted from a client system. For some time the client data has been extracted into Excel files, but there were some problems. My solution was to extract the client data files as CSVs instead.

What I found, unexpectedly, was that blank data values are treated differently by the PQ import functions depending on whether the file being imported is an Excel file or a CSV file!

Here's the rules:

  • For an Excel import, blanks are converted to null - always.
  • For CSVs, blanks are imported as blanks (not nulls). But when a field type is changed in a Power Query step, blanks in numeric and date columns are converted to nulls, while blanks in text columns remain blanks (not nulls).

Seeing nuthin' in Power Query

What does it matter?  Have a look at the following cases...

1. Excel data with blanks loaded into PQ. Blank cells are imported as null.

2. CSV data with blanks imported to Power Query: Blanks are read as blanks.

3. CSV data with blanks: dates and numbers change to null after type change. Text blanks remain blank

The impacts of nuthin' in Power Query

Nuthin' in Power Query could have caused me even more issues here... my transforms used conditional columns to check for nulls in text columns, a test which fails when the value is a blank.

As Ken pointed out, another key issue is that the Fill Down and Fill Up functions in Power Query are used to fill null values. But blanks are not nulls, so the functions do not work as intended in my case.

How to deal with this? Caveat emptor! Power Query treats blank data differently in the CSV import and Excel import functions.

So, if you are building transforms based on a consistent source of imported data, there is little impact. But if you need to change the type of data source like I did, then beware. I had to do some detailed testing, and I was facing a re-write of some longstanding and well-tested transforms.

Instead, for my specific situation, I added a value change step to the CSV import in PQ to replace nuthin' (blank) with the keyword null for the whole data set. That allowed me to maintain my downstream logic, but cost me some processing speed.
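For reference, that value change step looks roughly like the sketch below; the CSV path, delimiter and step names are hypothetical, and the key line is the Table.ReplaceValue over every column:

    let
        Source = Csv.Document(File.Contents("C:\Data\Export.csv"), [Delimiter=",", Encoding=65001]),
        Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
        // Swap blank text values for real nulls in every column, so downstream
        // null checks and Fill Down steps behave the same as with an Excel source
        BlanksToNull = Table.ReplaceValue(Promoted, "", null, Replacer.ReplaceValue, Table.ColumnNames(Promoted))
    in
        BlanksToNull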

I hope this alerts PQ users to a potential issue in their transforms. If there are any alternate solutions to the problem let's see them in the comments.

Power Query Recipes Now Available

As you may have heard, we have been working on a set of helpful Power Query Recipes for Excel and Power BI and are excited to announce that they are now officially available!

Are our Power Query Recipes for you?

The Power Query Recipes are targeted at people who are familiar with the Power Query interface (in either Excel or Power BI), and will lead you step-by-step through the process needed to clean up and convert your data from one format into another.  And if you're not already comfortable with Power Query?  Consider joining the Power Query Academy so we can change that!

So what is in the Power Query Recipes package?

There are currently over 30 cards in the set, showing easy-to-follow steps that will deal with a variety of common data issues.  I personally am finding them super useful, and often refer back to them when I'm helping people clean up their data in person or in forums.

We've even marked each of the Power Query Recipes that has a video version in Power Query Academy.  If you are already subscribed to our Academy, simply click the video camera in the bottom corner, and it will take you straight to the appropriate video so that you can see the technique demonstrated in a live setting.

Are there samples of the Power Query Recipes?

Of course there are!

Here's an example of one of my favorites, which lays out how to create a Full Anti Join, something that does not exist in the regular Power Query user interface:

Full Anti Join recipe card

And another which shows how to create a Calendar Table on the fly:

Calendar Table recipe card

We actually have a sample package available which includes four recipe cards (including the precedent card for the calendar recipe shown above), as well as the full table of contents for the current version.

Just some of the patterns included in the full version are:

  • Splitting data into new columns or rows
  • Pivoting, unpivoting, and transposing data
  • Several different ways to merge tables (including the Full Anti Join shown above)
  • Extracting a new column based on values in the prior or next rows
  • Six different ranking methods
  • Creating dynamic calendar tables
  • Adding a random number to all rows

Will there be more Power Query Recipes?

Oh yes!  While there are already over 30 recipes in the set, we already have another 20 on our "to-do" list, and are adding to that list regularly.  Our intention is to release new recipes on a regular basis, putting even more Power Query resources at your fingertips.

How do I get my copy of the Power Query Recipes?

There are a few different options:

  1. If you are a Power Query Academy member*, the recipes are already included in your registration.
  2. Pre-order a copy of the new 2nd edition of our M is for Data Monkey book from Amazon by September 30, 2018 and receive a FREE set of the initial recipes.**
  3. Purchase the current stand-alone set of recipes from the Power Query Training web store for $24.95 USD. There is also an option to add a subscription ($2.95 USD every 3 months) so that you get all the new recipe cards as we expand the set.

* Please note that the recipe cards are not included in the free Power Query Academy trial, only with the full course.
**Upon receipt of your Amazon proof of purchase via email at support@powerquery.training, we will provide a coupon code for $24.95 which can be applied to either the stand-alone or subscription purchase.  Act fast though, as this offer ends on October 31, 2018.

Disaster Recovery in Puerto Rico with Power Query

When Ken was at the Microsoft Business Applications Summit a few weeks ago, he met Mr. J.A. Garcia who has been doing some amazing work with Power Query. We wanted to share his story about how he has been using Power Query in helping with disaster recovery efforts in Puerto Rico:

"[In] my line of work there's been two defining moments that have changed the way we look at our tools. The first one was the Zika outbreak and the second one was Hurricane Maria.

The first time I saw Power Query was [as part of] Power BI during the Zika outbreak [in 2016]. One of our clients needed up-to-date information of the Zika outbreak and its effect on healthcare. With the help of a consultant, we started using Power BI and Power Query.

Aedes aegypti mosquito

An Aedes aegypti mosquito, one of the main transmitters of Zika virus.

I began taking courses during that time, and one of them was about Excel. That's when I learned about Get & Transform in Excel 2016.

Any new job that I received, I tried to use Power Query. I taught myself SQL so I could understand better the process of extracting data and how to integrate it into Power Query.

Our job was changing. We could give the tools to our clients that would let them refresh when they needed it the most. No more waiting [on] our area for a data refresh!

Then Hurricane Maria hit Puerto Rico [in September/October 2017]. It was a harsh two weeks of no communication. As soon as I came back from work, I noticed the change in attitude. As a healthcare company, we began doing Public Health.

Hurricane Maria - Disaster Recovery with Power Query

Hurricane Maria is regarded as being the worst natural disaster on record to affect Dominica and Puerto Rico and the deadliest Atlantic hurricane since Hurricane Stan in 2005.

My main job was identifying members with certain serious conditions. I used Power Query and Excel to create processes that obtain information from the assessment done to keep track of the efforts of the company. The clients could refresh the data and see who was missing, fix any data entry errors and more.

I'm very proud of my work, and Power Query in Excel and Power BI has been a large part of my growth. In the present, we have created a tool that refreshes constantly to help identify members with serious conditions. Now in case of any emergency, we'll know who to attend."

~ J.A. Garcia

We were very inspired by how Mr. Garcia began his Power Query journey as part of the disaster recovery efforts after these emergencies, and by how he and his team continue to leverage this powerful tool in both Excel and Power BI. Power Query really can help save lives!

Do you have a story to share about your Power Query journey? Maybe it hasn't saved your life literally, but perhaps it has saved you hours of time and effort, a significant amount of money, or even your sanity! Let us know in the comments below or contact us through the Excelguru site.

Update on the Master Your Data Book (Data Monkey v2)

Miguel and I were at the Microsoft Business Applications Summit last week, and we were frequently asked for an update on the Master Your Data book (aka M is for Data Monkey version 2.0). We were told that it’s time. People pointed out that they had pre-ordered it on Amazon ages ago. Enthusiasts asked why we don’t have a subscription model with monthly updates like Power Query does.

Master Your Data Book Cover

There’s a hunger to see the new version. We’re flattered that you rely on us, and honestly, we’re gutted that you are still waiting for it. And after attending the summit, we know that we need to give you an update on the Master Your Data book.

Some background on publishing…

Before we tell you what’s happening, I’d like to just explain a bit about the back story on what we have to consider when we write books on technology. The primary factors are:

  1. How to fit it in with our schedules. Even over the long term, books don’t come close to earning anywhere near the financial rewards of just dedicating time to consulting projects. (This is a big factor in the subscription question.)
  2. In today’s world of constant updates, we know that there are new features added on a monthly basis. The question is, which ones are serious enough to cause us to delay the release?

If you look back at M is for Data Monkey, we are really proud of its long-term value and continued relevance. It kills us that merges aren’t in there… they came out a few days after the book went to print. Would we have held the book for them if we had known? Yes. Would we have held the book for conditional columns? No. This is just one of the kinds of decisions we have to face.

No matter which way we go, we’ll always wish we waited for the next great feature. And we can’t. We know that. But our goal is to make sure that the material inside the book stands the test of time well and continues to hold relevance as features are added and changed. We believe that we’ve managed to do that with M is for Data Monkey fairly well, all things considered. Are there easier ways to do some things today? Yes. But does the book give you a deeper understanding and still let you accomplish the same goals? We believe it does.

The factors that lead to delays

Features are one thing. They generally add new functionality. But User Interface changes are something else entirely…

Earlier this year, we made the call to delay the Master Your Data book in order to get a clearer picture on what Excel 2019 was going to look like. We needed to know which Power Query features would be there, and which wouldn’t. It just doesn’t make sense to publish a book around the same time of Excel 2019’s release with Excel 2016 screen shots. We’ve already been told that some stuff in M is for Data Monkey looks “dated”. We certainly didn’t want the new Master Your Data book to be “dated” on the day of release due to a User Interface change.

And now, at the Microsoft Business Applications summit, we saw a preview of what is targeted for release into Power Query in the next few months. These new features are significant and impactful. But most relevant to us is that they contain a significant change to the Power Query User Interface. They will affect every single screen we use. They will affect every single screen shot we take. And if we don’t wait, we will deliver to you a brand-new book with pictures that don’t look anything like the User Interface you see on screen. Even if we were to push material to the publisher today, it takes 2-3 months to get the book to Amazon, so best case, you would get 2-3 months' use out of the book before the interface changes.

To us, that is irresponsible. We refuse to take your money and deliver you a substandard product. It’s just not right.

Just how significant were the MS Biz Apps announcements?

If you weren’t at the Microsoft Business Applications Summit, you might not know about these announcements. You can read the full list here, but let’s recap the key ones for us here. We can divide them into two categories:

  1. Awesome-but-not-critical (i.e. we would cry because we couldn’t include these, but wouldn’t delay the book for them)
    • New data connectors (including extract from PDF)
    • Fuzzy lookup
  2. Critical features (stuff that must be in the Excel version of Power Query before we can test material, write about nuances, shoot images and release the book)
    • Data profiling (quality) previews
    • M Intellisense in the Advanced Editor, formula bar and Add Custom Column window

These last two features will have a significant effect on the images of the book, as you can see here:

Power Query UI Preview

What is the revised timeline for the Master Your Data book?

The new Power Query features are estimated to arrive in Power BI Desktop by October 2018. And based on the historical pattern, these features will show up in Excel within 2-3 months of their Power BI release date. That gives us time to test the new features, take screenshots, and revamp the book order to best tell the data story the way we need to. We are hoping to have the book in print by the end of Q1 2019. It’s still aggressive on our side, but that is our refined target. If the builds ship later, or things take longer than anticipated, it could slide into Q2.

Yes, we know it’s a long way away. We know you’ve been waiting, and we wish it could be faster. But again, we hope that you understand that we are doing this to truly give you the best book that will last longer than it could otherwise.

In the meantime – can we give you something else?

We have been working on another product as well: Master Your Data in Excel & Power BI recipe cards.

Naturally, all members of our Power Query Academy will get a free copy of these. And due to the significant delay of the book, we’d also like to offer a free copy to anyone who has pre-ordered Master Your Data on Amazon. (More on this below.)

So, why can we do these, but not the book? It’s because they assume you already know Power Query’s User Interface, so they only provide the steps needed to accomplish the goal. Loaded with before and after pictures, and the route to get from one to the other, we aren’t bound by User Interface design changes.

Here’s a quick sample of one of the cards:

Power Query Recipe: Pivoting Stacked Data

And another:

Power Query Recipe: Split Records into Columns

How will these be sold?

Ultimately, we plan on selling this product on a subscription basis through our web store as follows:

  • $14.99 for the purchase of the downloadable card set
  • $2.99/quarter for a subscription to updates

We already have 26 cards designed, with more on the way. As we expand the set, the original purchase will include the new cards. But for those on subscription, we will update your original purchase and give you access to the new cards when we release them. We’re not intending to hold these for quarterly release, but rather send a new one every time we build it. You might get five in one quarter and one in the next, but our intention is to keep delivering new patterns as we discover them and build summary tips cards to illustrate them.

Wait… didn’t you say subscription doesn’t work for publishers?

For books, yes, it’s really hard. They’re complicated and require ensuring that things are taught in the right order, with all the updated techniques along the way. These Master Your Data recipe cards are snapshots of what to do in certain data cases when working with data in Excel and Power BI, so they have a much more refined scope.

It’s way easier for us to update cards, or add entirely new ones, as it doesn’t require re-writing preceding chapters. So in this case, it makes sense, as we can provide an initial catalogue of patterns and add more over time. We’ve already got ideas for a bunch more to expand this set.

How can I get my hands on the Master Your Data Recipe Card set?

There are a few ways…

For those of you who are members of the Power Query Academy, we will add the tip cards as a resource as soon as we have them ready. It’s part of your subscription so, as long as you’re still an Academy member, you’ll get all the new ones we create. In addition, we will also make sure you get a copy of the new Master Your Data book as soon as it is released (even if your subscription has expired and you’re no longer an Academy member).

For those of you who have pre-ordered our new book on Amazon, please follow the Excelguru blog. We will post when the recipe cards are ready and will let you know what you need to do to get your free download of the initial package of cards. The subscription for updates will be available as well, but will be entirely optional.

And if you’ve just been waiting for the Master Your Data book and haven’t purchased yet, all good. We’ll be setting these up in an online store to allow you to buy the download version and (optionally) sign up for the updates as well.

When will the Master Your Data Recipe Card set be available?

Soon. We are in final design for the card set now and need to set up our web store to handle subscriptions. Our target is to have that all done by September 15, 2018, if not earlier. Keep watching here for the official announcement.

Ultimately…

…we wish we could send the Master Your Data book to you today, but hope that this will make a reasonable substitute to get you over the hump until we can. Thank you for your patience, understanding and trust in us as we work to deliver you the best version we possibly can.

Power Pivot eBook Coming Soon

It's been a long time coming, but we are putting the finishing touches on the third installment of our free 'DIY BI' series. We are excited to announce that the Power Pivot eBook will be officially released on Tuesday, July 3, 2018!

Power Pivot eBook

This brand new book will feature five of Ken's top tips, tricks, and techniques for Power Pivot, including:

  • Hiding fields from a user
  • Hiding zeros in a measure
  • Using DAX variables
  • Retrieving a value from an Excel slicer
  • Comparing data using one field on multiple slicers

Power Pivot eBook

 

About the 'DIY BI' Series

This free eBook series is available to anyone who signs up for the monthly(ish) Excelguru email newsletter. The series includes four books, one edition each for Excel, Power Query, Power Pivot, and Power BI. Each book contains five of our favourite tips, tricks, and techniques which Ken developed over years of research and real-world experience.

DIYBI eBook Series

We first launched this series in the spring of 2017 with the Excel Edition, and the Power Query edition followed later that summer. You can read some more about why Ken decided to create this series in his initial blog post about it.

The Excelguru Newsletter

The monthly Excelguru email newsletter features the latest updates for Excel and Power BI, as well as upcoming training sessions and events, new products, and other information that might be of interest to the Excel and Power BI community.

Don't Miss Out, Get Your Free Copy of the Series

If you're not already a newsletter subscriber, you can sign up here. We will send you the Excel Edition right away, and the Power Query Edition a few days later. All of our current and new subscribers will receive the Power Pivot edition once it is released on July 3, 2018. Be sure to keep an eye on your inbox for the new book.

We will be continuing to work on the fourth and final book, the Power BI Edition, over the coming months so stay tuned for details!