Treat Consecutive Delimiters As One

A couple of weeks back, I received a comment on my “Force Power Query to Import as a Text File” blog post.  Rudi mentioned that he’d really like to see a method that would let us Treat Consecutive Delimiters as One; something that is actually pretty easy to find in Excel’s old Import as Text file feature.

…the option for “Treat consecutive delimiters as one” should be made available too. The text import eventually imported but the info was all out of sync due to the tabbed nature of the data – names longer and shorter cause other data to not line up.

So let’s take a look at this issue today.  You can access and download the sample file I’ll be working with from this link.

The issue at hand

The contents of the text file, when viewed in Notepad, are fairly simple:

image

But the challenge shows up when we go to import it into Power Query.  Let’s do that now:

  • Power Query –> From File –> From Text
  • Browse to where you stored the file and double click it to Open it up

The first indication that we have an issue is that we’ve got a single column, the second is that it’s a bit scattered:

image

Well no problem, it’s a Tab delimited file, so we’ll just split it by Tabs, and all should be good, right?

  • Home –> Split Column –> By Delimiter
  • Choose Tab and click OK

image

Uh oh…  What happened here?

The issue is that we’ve got multiple delimiters acting between fields here.  They look like one, but they’re not.  Between the Date and Vendor headers, for example, there are two Tabs, but between the Date values and the Vendor values only one Tab is needed to line things up in the text document.  The result is this crazy mess.

Approaches to the Issue

I can see three potential routes to deal with this problem:

  1. We could replace all instances of 2 Tabs with a single Tab.  We’d need to do that a few times, though, to ensure that we don’t turn 4 Tabs into 2 and leave it at that.
  2. We could write a function to try and handle this.  I’m sure that this can be done, although I decided not to go that route this time.
  3. We could split the columns using the method that I’m going to outline below.

Before we start, let’s kill off the last two steps in the “Applied Steps” section, and get back to the raw import that we had at the very beginning.  (Delete all steps except the “Source” step in the Applied Steps window.)

How to Treat Consecutive Delimiters As One

The method to do this is a three step process:

  1. Split by the Left Most delimiter (only)
  2. Trim the new column
  3. Repeat until all columns are separated

So let’s do it:

  • Select Column1
  • Split Column –> By Delimiter –> Tab –> At the left-most delimiter

image

Good, but see that blank space before Vendor?  That’s one of those delimiters that we don’t want.  But check what happens when you trim it…

  • Select Column1.2
  • Transform –> Format –> Trim

And look at that… it’s gone!

image

Let’s try this again:

  • Select Column1.2
  • Split Column –> By Delimiter –> Tab –> At the left-most delimiter

image

And trim the resulting column again:

  • Select Column1.2.2
  • Transform –> Format –> Trim

And we’re good.  Now to just do a final bit of cleanup:

  • Transform –> Use First Row as Headers
  • Name the Query

And we’re finished:

image
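For reference, here is a rough M sketch of the whole routine above.  The sample data, file contents, step names and column names are made up for illustration, so your generated code will differ, but the split-at-left-most / trim pattern is the same:

let
    // Hypothetical sample: a single text column where fields are separated by one OR MORE tabs
    Source = Table.FromColumns(
        {{"Date#(tab)#(tab)Vendor#(tab)Amount",
          "1/1/2014#(tab)Vendor A#(tab)#(tab)100"}},
        {"Column1"}),

    // Split at the left-most tab only
    SplitOnce = Table.SplitColumn(Source, "Column1",
        Splitter.SplitTextByEachDelimiter({"#(tab)"}, QuoteStyle.Csv, false),
        {"Column1.1", "Column1.2"}),

    // Trimming removes the stray tab left behind by a doubled delimiter
    TrimOnce = Table.TransformColumns(SplitOnce, {{"Column1.2", Text.Trim, type text}}),

    // Repeat the split/trim pair for the next field
    SplitTwice = Table.SplitColumn(TrimOnce, "Column1.2",
        Splitter.SplitTextByEachDelimiter({"#(tab)"}, QuoteStyle.Csv, false),
        {"Column1.2.1", "Column1.2.2"}),
    TrimTwice = Table.TransformColumns(SplitTwice, {{"Column1.2.2", Text.Trim, type text}}),

    // Promote the first row to headers
    Promoted = Table.PromoteHeaders(TrimTwice)
in
    Promoted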

Easy as Pie?

In this case, yes it was.  Easy, if a bit painful.  But what about spaces?  If this file were space delimited, I couldn’t have done this, as my Vendor names all have spaces in them too.  So then what?

I’d modify my procedure just a bit (a rough M sketch of this follows the list):

  1. Replace all instances of double spaces with the | character
  2. Split by the left most | character
  3. Replace all | characters with spaces
  4. Trim the new column
  5. Repeat until finished
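Here is a rough M sketch of one pass of that loop, again with made-up sample data and step names.  One small tweak from the list above: in step 3 I put two spaces back (rather than one), so that any delimiters still buried in the un-split column can be found again on the next pass:

let
    // Hypothetical space delimited row where the vendor name itself contains spaces
    Source = Table.FromColumns(
        {{"1/1/2014  Vendor With Spaces  100"}},
        {"Column1"}),

    // 1. Replace all instances of double spaces with a character that can't occur in the data
    ToPipes = Table.ReplaceValue(Source, "  ", "|", Replacer.ReplaceText, {"Column1"}),

    // 2. Split at the left-most | character only
    SplitOnce = Table.SplitColumn(ToPipes, "Column1",
        Splitter.SplitTextByEachDelimiter({"|"}, QuoteStyle.Csv, false),
        {"Column1.1", "Column1.2"}),

    // 3. Put the remaining | characters back to double spaces...
    BackToSpaces = Table.ReplaceValue(SplitOnce, "|", "  ", Replacer.ReplaceText, {"Column1.2"}),

    // 4. ...and trim the new column.  5. Repeat steps 1-4 on Column1.2 until every field is split out
    Trimmed = Table.TransformColumns(BackToSpaces, {{"Column1.2", Text.Trim, type text}})
in
    Trimmed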

Final Thought

I haven’t received too much in the way of data like this, as most of my systems dump data into properly delimited text or csv files, but I can certainly see where this is an issue for people.  So I totally agree that it would be nice if there were an easier way to treat consecutive delimiters as one in Power Query.

I do like the versatility of the choices we have currently, but adding this as another option that works in combination with the existing ones would be fantastic.

Power Query team: here’s my suggestion…

image

:)

Force Power Query to Import as a Text File

I’ve run into this issue in the past, and I also got an email about it this past week, so I figured it’s worth taking a look.  Power Query takes certain liberties when importing a file, assuming it knows what type of file it is.  The problem is that sometimes this doesn’t work as expected, and you need to be able to force Power Query to import as a text file, not as the file format that Power Query assumes you have.

IT standards are generally a beautiful thing, especially in programming, as you can rely on them, knowing that certain rules will always be followed.  CSV files are a prime example of this, and we should be able to assume that any CSV file will contain a list of Comma Separated Values, one record per line, followed by a new line character.  Awesome… until some bright spark decides to inject a line or two of information above the CSV contents which doesn’t contain any commas.  (If you do that, please stop.  It is NOT A GOOD IDEA.)

The Issue in the Real World

If you’d like to follow along, you can click here to download MalformedCSV.csv (the sample file).

If you open the sample file in Notepad, you’ll see that it contains the following rows:

image

Notice the first row… that’s where our issue is.  There are no commas.  Yet when you look at the data in the rows below, they are plainly separated by commas.  Well yay, so what, right?  Who cares about the guts of a CSV file?  The answer is “you” if you ever get one that is built like this…

Let’s try importing the sample file into Power Query:

  • Power Query –> From File –> From CSV
  • Browse to MalformedCSV.csv

And the result is as follows:

image

One header, and lots of errors.  Not so good!

The Source Of The Error

If I click the white space beside one of those errors, I get this:

image

What the heck does that even mean?

Remember that CSV is a standard.  Every comma indicates a column break, every carriage return a new line.  And we also know that every CSV file has a consistent number of columns (and therefore commas) on every row.  (That’s why you’ll see records in some CSVs that read ,, – because there isn’t anything for that record, but we still need the same number of commas to denote the columns.)

And now some joker builds us a file masquerading as a CSV that really isn’t.  In this case:

  • Our first row has no commas before the line feed.  We therefore must have a one column table.  Power Query sets us up for a one column table.
  • But our second row has three commas, which means four columns… That’s not the same number of columns as our header row, so Power Query freaks out and throws an error for every subsequent record.

So Now What?

If we can’t rely on the file format being correct, why don’t we just import it as a text file?  That would allow us to bring all the rows in, remove the first one, then split by commas.  That sounds like a good plan.  Let’s do it.

  • Power Query –> From File –> From Text
  • Browse to the folder that holds MalformedCSV.csv

Uh oh… our file is not showing.  Ah… we’re filtered to only show *.txt and *.prn files…

  • Click the file filter list in the bottom right and change “Text File (*.txt;*.prn)” to “All Files (*.*)”
  • Open MalformedCSV.csv

And the result…

image

Damn.  See, Power Query is too smart.  It looks at the file and says “Hey!  That’s not a text file, it’s a CSV file!” and then imports it as a CSV file… which we already know has issues.  Grr…

Force Power Query to Import as a Text File

Let’s try this again, this time from scratch.  We need two things here…

  1. The full file path to the file you want to import (including the file extension).  On my system it is “D:\Test\MalformedCSV.csv”
  2. A little bit of a code template, which is conveniently included below.

What we’re going to do is this:

  • Go to Power Query –> From Other Sources –> Blank Query
  • View –> Advanced Editor
  • Paste in the following code

let
    /* Get the raw line by line contents of the file, preventing PQ from interpreting it */
    fnRawFileContents = (fullpath as text) as table =>
        let
            Value = Table.FromList(
                Lines.FromBinary(File.Contents(fullpath)),
                Splitter.SplitByNothing())
        in
            Value,

    /* Use function to load file contents */
    Source = fnRawFileContents("D:\Test\MalformedCSV.csv")
in
    Source

  • Update the file path in the “Source” step to the path to your file.
  • Click Done

And the output is remarkably different… in fact, it’s the entire contents of the file!

image

This is awesome, as we now have the ability to clean up our data and use it as we were hoping to do from the beginning!  So let’s do just that…starting with killing off that first line that’s been screwing us up:

  • Home –> Remove Rows –> Remove Top Rows –> 1
  • Transform –> Split Column By Delimiter –> Comma –> At each occurrence of the delimiter
  • Transform –> Use First Row As Headers

And now we’ve got a pretty nice table of data that actually looks like our CSV file:

image
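Pieced together, the whole query might look something like the sketch below.  The step names, the file path and the assumption of four columns are illustrative; use whatever the UI generates for you:

let
    /* Get the raw line by line contents of the file, preventing PQ from interpreting it */
    fnRawFileContents = (fullpath as text) as table =>
        let
            Value = Table.FromList(
                Lines.FromBinary(File.Contents(fullpath)),
                Splitter.SplitByNothing())
        in
            Value,

    /* Use function to load file contents */
    Source = fnRawFileContents("D:\Test\MalformedCSV.csv"),

    /* Kill the bogus first row, split the remainder by commas, promote the real headers */
    RemovedTopRow = Table.Skip(Source, 1),
    SplitByComma = Table.SplitColumn(RemovedTopRow, "Column1",
        Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv),
        {"Column1.1", "Column1.2", "Column1.3", "Column1.4"}),   // assuming four columns
    PromotedHeaders = Table.PromoteHeaders(SplitByComma)
in
    PromotedHeaders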

The final steps to wrap this up would essentially be

  • set our data types,
  • name the query, and
  • load the query to the final destination.

Final Thoughts

This seems too hard.  I know that Power Query is designed for ease of use, and to that end it’s great that it can and does make assumptions about your file.  Most of the time it gets it exactly right and there is no issue.  But when things do go wrong it’s really hard to get back to a raw import format so that we can take control.

I really think there should be an easy and discoverable way to do a raw import of data without making import/formatting assumptions.  A button for “Raw Text Import” would be a useful addition for those scenarios where stuff goes wrong and needs a careful hand.

I should also mention that this function will work on txt files or prn files as well.  In fact, it also works on Excel files to some degree, although the results aren’t exactly useful!

image

The key here is that, whether you’re dealing with one or more extra header lines in a csv file, or with a tab delimited, colon delimited or any other kind of delimited file, the small function template above will help you get past that dreaded message that reads “DataFormat.Error:  There were more columns in the result than expected.”

Addendum To This Post

Miguel Escobar has recorded a nice video of the way to do this without using any code.  You can find that here:

 

 

Building a Parameter Table for Power Query

One of the things that I’ve complained about in the past is that there is no built in way for Power Query to be able to pull up the path for the current workbook.  Today I’m going to show you how I solved that problem by building a parameter table for Power Query in Excel, then link it into my queries.


To do this successfully, we need two pieces: an Excel table, and a Power Query function.  So let’s get to it.

Building a Parameter Table

The table is quite simple, really.  It’s a proper Excel Table, and it has a header row with the following two columns:

  • Parameter
  • Value

Once created, I also make sure that I go to the Table Tools tab, and rename the table to “Parameters”.

image

Pretty bare bones, but as you’ll see, this becomes very useful.

Now we add something to the table that we might need.  Since I’ve mentioned it, let’s work out the file path:

  • A8:     File Path
  • B8:     =LEFT(CELL("filename",B6),FIND("[",CELL("filename",B6),1)-1)

Now, as you can see, column A essentially gives us a “friendly name” for our parameter, where the value ends up in the second column:

image

While we’re here, let’s add another parameter that we might have use for.  Maybe I want to base my power query reports off the data for the current day.  Let’s inject that as well:

  • A9:     Start Date
  • B9:     =TODAY()

image

Good stuff.  We’ve now got a table with a couple of useful parameters that we might want when we’re building a query.

Adding the Parameter Function

Next, we’re going to add the function that we can reference later.  To do that:

  • Go to Power Query –> From Other Sources –> Blank Query
  • Go to View –> Advanced Editor
  • Replace all the code with the following:

(ParameterName as text) =>
let
    ParamSource = Excel.CurrentWorkbook(){[Name="Parameters"]}[Content],
    ParamRow = Table.SelectRows(ParamSource, each ([Parameter] = ParameterName)),
    Value =
        if Table.IsEmpty(ParamRow) = true
        then null
        else Record.Field(ParamRow{0}, "Value")
in
    Value

  • Click Done
  • Rename the function to “fnGetParameter”
  • Go to Home –> Close & Load To…
  • Choose to create the connection only, avoiding loading to a table or the Data Model

And just in case you think this means it didn’t work: what we expect to see in the queries window is the function listed as a connection only, with nothing loaded to the workbook:

image

Making Use of the Parameter Table

Okay, so now that we have this in place, how do we go about using it?

Let’s create a ridiculously simple table:

image

Now, click in the table and go to Power Query –> From Table.

We’ll be taken into the Power Query window and will be looking at our very simple data.  Let’s pull the contents of my parameters into columns:

  • Go to Add Column –> Add Custom Column
  • Change the title to “File Path”
  • Enter the following formula: =fnGetParameter("File Path")

Check it out!

image

Do you see that?  This is the path to the folder that holds the workbook on my system.  The formula we used in the table retrieves that, and I can pass it in to Power Query, then reference it as needed!

How about we do the same for our date?

  • Go to Add Column –> Add Custom Column
  • Change the title to “First date”
  • Enter the following formula: =fnGetParameter("Start Date")

image

The key here is to make sure that the value passed to the parameter function is spelled (and cased) the same as the entry in the first column of the parameter table.  I.e. You could use “FilePath”, “File Path”, “Folder”, “My File Path” or whatever, so long as that’s the name you gave it in the first column of the Parameters Excel table.

And what happens if you pass an invalid value?  Say you ask for fnGetParameter(“Moldy Cheese”) and it’s not in the table?  Simple, you’ll get null returned instead.  :)

Implications of Building a Parameter Table for Power Query

The implications for this technique are huge.  Consider this scenario… you create your workbook, and store it in a folder.  But within that folder you have a subfolder called “Data”.  Your intention is to store all of your csv files in that folder.  And, for argument’s sake, let’s say that it’s a mapped drive, with the path to your solution being “H:\My Solution\”

No problem, you build it all up, and it’s working great.  You keep dropping your text files in the data folder, and you can consolidate them with some M code like this:

let
    Source = Folder.Files("H:\My Solution\Data"),
    #"Combined Binaries" = Binary.Combine(Source[Content]),
    #"Imported CSV" = Csv.Document(#"Combined Binaries",null,",",null,1252),
    #"First Row as Header" = Table.PromoteHeaders(#"Imported CSV")
in
    #"First Row as Header"

Things run along for ages, and that’s awesome, but then you need to go on vacation.  No worries, it’s Power Query and easy to use, you can just get your co-worker to update it… except… on your co-worker’s machine that same drive is mapped not to the H:\ drive, but the F:\ drive.  Doh!

We could recode the path, but what a pain.  So how about we use the parameter table to make this more robust so that we don’t have to?  All we need to do is modify the first couple of lines of the query.  We’ll pull in a variable to retrieve the file path from our parameter table, then stuff that into the file path, like this:

let
    SolutionPath = fnGetParameter("File Path"),
    Source = Folder.Files(SolutionPath & "Data"),
    #"Combined Binaries" = Binary.Combine(Source[Content]),
    #"Imported CSV" = Csv.Document(#"Combined Binaries",null,",",null,1252),
    #"First Row as Header" = Table.PromoteHeaders(#"Imported CSV")
in
    #"First Row as Header"

How awesome is that?  Even better, the SolutionPath shows as a step in the Applied Steps section.  That means you can select it and make sure the value is showing up as you’d expect!

Practical Use Case

Several months back I built a solution for a client where we stored his Power Query solution in a folder, and we had data folders that were created on a bi-weekly basis.  Each of those folders was named based on the pay period end date (in ISO’s yyyy-mm-dd format), and was stored in a path relative to the master solution.

Naturally, I needed a way to make the queries dynamic to the master folder path, as I did the development in my home office, then emailed the updated solution files to him in his New York office.  He had different drive mappings than I do, and his team had different drive mappings than he did.  With half a dozen Power Queries in the solution, having them manually update the code each time a new user wanted to work with the file just wasn’t an option.

This technique became invaluable for making the solution portable.  In addition, by having a formula to generate the correct end date, we could also pull the data files from the right place as well.

I still want a Power Query CurrentWorkbook.FilePath method in M, but even if I get it this technique is still super useful, as there will always be some dynamic parameter I need to send up to Power Query.

I hope you find this as useful as I have.

Combine Multiple Worksheets Using Power Query

In last week’s post we looked at how to combine multiple files together using Power Query.  This week we’re going to stay within the same workbook, and combine multiple worksheets using Power Query.

Background Scenario

Let’s consider a case where the user has been creating a transactional history in an Excel file.  It is all structured as per the image below, but resides across multiple worksheets; one for each month:

image

As you can see, they’ve carefully named each sheet with the month and year.  But unfortunately, they haven’t formatted any of the data using Excel tables.

Now the file lands in our hands (you can download a copy here if you’d like to follow along), and we’d like to turn this into one consolidated table so that we can do some analysis on it.

Accessing Worksheets in Power Query

Naturally we’re going to reach to Power Query to do this, but how do we get started?  We could just go and format the data on each worksheet as a table, but what if there were hundreds?  That would take way too much work!

But so far we’ve only seen how to pull Tables, Named Ranges or files into Power Query.  How do we get at the worksheets?

Basically, we’re going to start with two lines of code:

  • Go to Power Query –> From Other Sources –> Blank Query
  • View –> Advanced Editor

You’ll now see the following blank query:

let
    Source = ""
in
    Source

What we need to do is replace the second line (Source = "") with the following two lines of code:

    FullFilePath = "D:\Temp\Combine Worksheets.xlsx",
    Source = Excel.Workbook(File.Contents(FullFilePath))

Of course, you’ll want to update the path to the full file path for where the file is saved on your system.

Once you click Done, you should see the following:

image

Cool!  We’ve got a list of all the worksheets in the file!

Consolidating the Worksheets

The next step is to prep the fields we want to preserve as we combine the worksheets.  Obviously the Name and Item columns are redundant, so let’s do a bit of cleanup here.

  • Remove the Kind column
  • Select the Name column –> Transform –> Data Type –> Date
  • Select the Name column –> Transform –> Date –> Month –> End of Month
  • Rename the Name column to “Date”

At this point, the query should look like so:

image
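In M, those steps (combined with the source lines from earlier) come out roughly like the sketch below.  The step names are illustrative, and the date conversion assumes sheet names like “Jan 2008” convert cleanly under your regional settings:

let
    FullFilePath = "D:\Temp\Combine Worksheets.xlsx",
    Source = Excel.Workbook(File.Contents(FullFilePath)),

    // Drop the Kind column
    RemovedKind = Table.RemoveColumns(Source, {"Kind"}),

    // Convert the sheet names to dates, roll them to the month end, and rename the column
    NameAsDate = Table.TransformColumnTypes(RemovedKind, {{"Name", type date}}),
    MonthEnd = Table.TransformColumns(NameAsDate, {{"Name", Date.EndOfMonth, type date}}),
    RenamedDate = Table.RenameColumns(MonthEnd, {{"Name", "Date"}})
in
    RenamedDate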

Next we’ll click the little double headed arrow to the top right of the data column to expand our records, and commit to expanding all the columns offered:

image

Hmm… well that’s a bit irritating.  It looks like we’re going to need to promote the top row to headers, but that means we’re going to overwrite the Date column header in column 1.  Oh well, nothing to be done about it now, so:

  • Transform –> Use First Row As Headers –> Use First Row As Headers
  • Rename Column1 (the header won’t accept 1/31/2008 as a column name) to “Date” again
  • Rename the Jan 2008 column (far right) to “Original Worksheet”

Final Cleanup

We’re almost done, but let’s just do a bit of final cleanup here.  As we set the data types correctly, let’s also make sure that we remove any errors that might come up from invalid data types.

  • Select the Date column
  • Home –> Remove Errors
  • Set Account and Dept to Text
  • Set Amount to Decimal Number
  • Select the Amount column
  • Home –> Remove Errors
  • Set Original Worksheet to Text

Rename the query to “Consolidated”, and load it to a worksheet.

Something Odd

Before you do anything else, Save the File.

To be fair, our query has enough safeguards in it that we don’t actually have to do this, but I always like to play it safe.  Let’s review the completed query…

Edit the Consolidated query, and step into the Source line step.  Check out that preview pane:

image

Interesting… two more objects!  This makes sense, as we created a new table and worksheet when we retrieved this into a worksheet.  We need to filter those out.

Getting rid of the table is easy:

  • Select the drop down arrow on the Kind column
  • Uncheck “Table”, then confirm when asked if you’d like to insert a step

Select the next couple of steps as well, and take a look at the output as you do.

Aha!  When you hit the “ChangedType” step, something useful happens… we generate an error:

image

Let’s remove that error from the Name column.

  • Select the Name column –> Home –> Remove Errors

And we’re done.  We’ve managed to successfully combine all the data worksheets in our file into one big table!

Some Thoughts

This method creates a bit of a loop in that I’m essentially having to reach outside Excel to open a copy of the workbook to pull the sheet listing in.  And it causes issues for us, since Power Query only reads from the last save point of the external file we’re connecting to (in this case this very workbook.)  I’d way rather have an Excel.CurrentWorkbook() style method to read from inside the file, but unfortunately that method won’t let you read your worksheets.

It would also be super handy to have an Excel.CurrentWorkbookPath() method.  Hard coding the path here is a real challenge if you move the file.  I’ve asked Microsoft for this, but if you think it is a good idea as well, please leave a comment on the post.  (They’ll only count one vote from me, but they’ll count yours if you leave it here!)

Merge Multiple Files With Properties

This post illustrates a cool technique that I learned at the MVP summit last week, allowing us to use Power Query to merge multiple files with properties from the file in the output.  A specific example of where this is useful is where you have several files with transactional data, saved with the month as the file name, but no date records in the file itself.  What we’d want to do is merge all the contents, but also inject the filename into the records as well.

The funny thing about this technique is that it’s eluded me for a long time, mainly because I way overthought the methods needed to pull it off.  Once you know how, it’s actually ridiculously simple, and gives us some huge flexibility to do other things.  Let’s take a look at how it works.

If you’d like to download the files I’m using for this example, you can get them here.  You’ll find that there are 3 files in total: Jan 2008.csv, Feb 2008.csv and Mar 2008.csv.

Step 1: Create your query for one file

The first thing we need to do is connect to the Jan 2008.csv file and pull in its contents.  So let’s do that:

  • Power Query –> From File –> From CSV
  • Browse to the Jan 2008.csv file and import it
  • Rename the “Sum of Amount” column to “Amount”

Perfect, we now have a basic query:

image

Notice here how the January file has no dates in it?  That’s not good, so that’s what we’re trying to fix.

Step 2: Turn the query into a function

At this point we need to do a little code modification.  Let’s go into the Advanced editor:

  • View –> Advanced Editor

We need to do two things to the M code here:

  • Insert a line at the top that reads:  (filepath) =>
  • Replace the file path in the Source step (including the quotes) with:  filepath

At that point our M will read as follows:

image
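Since the screenshot doesn’t copy and paste well, here is roughly what the finished function reads like.  The middle steps are a sketch of what the CSV import wizard typically generates, so yours may differ slightly:

(filepath) =>
let
    Source = Csv.Document(File.Contents(filepath), null, ",", null, 1252),
    #"First Row as Header" = Table.PromoteHeaders(Source),
    #"Renamed Columns" = Table.RenameColumns(#"First Row as Header", {{"Sum of Amount", "Amount"}})
in
    #"Renamed Columns"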

We can now:

  • Click Done
  • Rename our Query to something like fnGetFileContents
  • Save and close the query

Power Query doesn’t do a lot for us yet, just giving us this in the Workbook Queries pane:

image

Step 3: List all files in the folder

Now we’re ready to make something happen.  Let’s create a new query…

  • Power Query –> From File –> From Folder
  • Browse to the folder that holds the CSV files
  • Remove all columns except the Name and Folder Path

Wait, what?  I removed the column with the binary content?  The column that holds the details of the files?  You bet I did!  You should now have a nice, concise list of files like this:

image

Next, we need to add a column to pull in our content, via our function.  So go to:

  • Add Column –> Add Custom Column
  • Enter the following formula:  =fnGetFileContents([Folder Path]&[Name])

Remember it is case sensitive, but when you get it right, some magic happens.  We get a bunch of “Table” objects in our new column… and those Table objects hold the contents of the files!

I know you’re eager to expand them, but let’s finish prepping the rest of the data first.

Step 4: Prep the rest of the data you want on the table rows

Ultimately, what I want to do here is convert the file name into the last day of the month.  In addition, I don’t need the Folder Path any more. So let’s take care of business here:

  • Remove the “Folder Path” column
  • Select the “Name” column –> Transform –> Replace Values –> “.csv” with nothing
  • Select the “Name” column –> Transform –> Data Type –> Date
  • Select the “Name” column –> Transform –> Date –> Month –> End of Month

And we’ve now got a pretty table with our dates all ready to go:

image

Step 5: Expand the table

The cool thing here is that, when we expand the table, each row of the table will inherit the appropriate value in the first column.  (So all rows of the table in row 1 will inherit 2/29/2008 as their date.)

  • Click the little icon to the top right of the Custom column
  • Click OK (leaving the default of expanding all columns)
  • Rename each of the resulting columns to remove the Custom. prefix

And that’s it!  You can save it and close it, and it’s good to go.
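Put together, the finished query reads something like the sketch below.  The folder path is illustrative, and the column names in the final expand step are hypothetical; pick whichever columns the expand dialog actually offers you:

let
    Source = Folder.Files("D:\Test\"),   // illustrative folder path

    // Keep just the file name and folder path
    KeptColumns = Table.SelectColumns(Source, {"Name", "Folder Path"}),

    // Call our function against each file to pull its contents in as a nested table
    AddedContents = Table.AddColumn(KeptColumns, "Custom",
        each fnGetFileContents([Folder Path] & [Name])),

    // Turn "Jan 2008.csv" into a month-end date
    RemovedPath = Table.RemoveColumns(AddedContents, {"Folder Path"}),
    RemovedExtension = Table.ReplaceValue(RemovedPath, ".csv", "", Replacer.ReplaceText, {"Name"}),
    NameAsDate = Table.TransformColumnTypes(RemovedExtension, {{"Name", type date}}),
    MonthEnd = Table.TransformColumns(NameAsDate, {{"Name", Date.EndOfMonth, type date}}),

    // Expand the nested tables; every row inherits its file's date
    Expanded = Table.ExpandTableColumn(MonthEnd, "Custom",
        {"Account", "Dept", "Amount"})   // hypothetical column names
in
    Expanded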

A little bit of thanks

I want to throw a shout out to Miguel Llopis and Faisal Mohamood from the Power Query team for demonstrating this technique for the MVPs at the summit.  I’ve been chasing this for months, and for some reason tried to make it way more complicated than it needs to be.

What’s the next step?

The next logical step is to extend this to working with Excel files, consolidating multiple Excel files together; something we can’t do through the UI right now.  Watch this space, as that technique is coming soon!

Name Columns During a Merge

The timing for the release of the October Power Query update couldn’t really have been much better for me, especially since it’s got a cool new feature in that you can now name columns during a merge.  Why is this timing so cool?

The reason is that I’m at the MVP Summit in Redmond this week, an event where the MVPs and Microsoft product teams get together to discuss the things that are/aren’t working in the products, and discuss potential new features.  The tie in here is that I already blogged about the new method for Merging Columns With Power Query that came into effect with the August update and, in that blog post, I said:

I do wish this interface had a way to name the column in advance (like exists when you create a custom column.)  Hopefully the PQ team will retrofit us with that ability at some point in the future.

The October Power Query update (released Oct 27, 2014), includes the ability to name columns during a merge, rather than making you do this in two steps.  How cool is that?

image

While I’m under no illusions that this change was made based on my feedback alone, I do know that the Power Query team reads my blog, so it played a part.  This is one of the big reasons I go to the summit every year, to share ideas that make the product more intuitive/usable, so it’s cool to see one of those changes make it into the product a few days before I arrive.

By the time this post is published on Wednesday morning, we’ll already be into our final day of the 2014 Summit.  I’m going in pretty jazzed though, as I know that the product teams are listening and reacting to our feedback.  :)

Tame Power Query Workbook Privacy Settings

I recently built a cool Excel solution at work that uses Power Query to reach out and grab some weather data, then upload it into a database.  We use that data in our dashboards, as weather is pretty important to the golf industry in which I work.  But then I went to deploy the file, and needed to find a way to tame the Power Query Workbook Privacy settings.

The Issue

What happens is, every time a user runs a Power Query that was last saved by another user, they are prompted to set the Workbook’s Privacy level.  This is maddening, as we have two staff that use this workbook, where one covers for the other while they’re away.  Naturally, long lapses of time can occur in between… just long enough to forget what to do when you’re prompted by this frustrating message:

image

So while I can (and have) set the privacy level for the web data they are going to retrieve (see my tool here), I have no way to permanently set the Workbook’s Privacy level.  Worse is that if the user clicks Cancel, or chooses the wrong privacy level, everything fails (even trying to protect the output table structure using Chris Webb’s technique here).  The table generates an error, and all business logic in the workbook is blown apart.  The only recourse is to exit the file without saving and try again.

Naturally, this concerns them, and they call to make sure they do the right thing.  That’s awesome, and I wouldn’t change that at all.  But the deeper underlying problem is that Power Query’s Workbook security is engineered to require developer interaction.  And that is bad news!

How to Tame Power Query Workbook Privacy Settings

Unfortunately I can’t use VBA to set this in the workbook (or at least, I haven’t tried, anyway), but I can work a little trick to at least warn my users when they are about to be prompted, and remind them which privacy level they need to select.  Here’s what I need to do in order to make that work:

Step 1: Create a range to hold the last user

  • Create a named range somewhere in the workbook called “rngLastUser”  (I put mine on the “Control Panel” worksheet and hid it.)
  • Drop the following code in a Standard VBA Module:

Public Sub UpdateLastUser()
    With ThisWorkbook.Worksheets("Control Panel")
        .Range("rngLastUser") = Application.UserName
    End With
End Sub

Step 2: Create a macro to update your Power Queries

Public Sub GetWeather()
    With ThisWorkbook
        'Check if privacy level will need to be set
        If .Worksheets("Control Panel").Range("rngLastUser") <> Application.UserName Then
            MsgBox "You are about to be prompted about Privacy Levels" & vbNewLine & _
                "for the Current Workbook. When the message pops up, " & vbNewLine & _
                "you'll see an option to 'Select' to the right side of the Current Workbook." & vbNewLine & _
                vbNewLine & _
                "Please ensure you choose PUBLIC from the list.", vbOKOnly + vbInformation, _
                "Hold on a second..."
        End If

        'Refresh the Power Query table
        .Worksheets("Weather").ListObjects("WeatherHistory").QueryTable.Refresh BackgroundQuery:=True
    End With
    Call UpdateLastUser
End Sub

Step 3: Link the macro to a button for your users

  • Link my GetWeather() routine to a button
  • And we’re good!

What Did We Do?

So basically, what I did here was this:

  • Every time the user clicks the button…
  • Excel checks the contents of rngLastUser to see if the username is the same as the current user
    • If it is, it just goes on to refresh the table
    • If it’s not, it kicks up the following message:

image

    • After the user clicks OK (if necessary), then it prompts the user to set the security level. Yes, they can still get it wrong, but at least they have a chance now!
    • Once the security level is set, the macro goes on to refresh the table
  • After the table is refreshed, Excel updates the rngLastUser cell to the name of the current user.

And that’s it.  We now have a system that will prompt our users with the correct answer, so that they don’t have to come back and ask us every time.

Thoughts On The Security Model

Ideally it would be nice to not have to do this, and there is – in fact – a way.  Microsoft’s answer is “Yes, just enable Fast Combine.”  That’s great and all, but then it ignores all privacy levels and runs your query.  What if, as a developer, I need a way to ensure that what I’ve built stays Public, Private or Organizational?  What if it actually matters?

To me, Fast Combine is akin to setting Macro Security to Low in Excel 2003 and earlier.  Basically, we’re saying “I don’t like nagging messages, so let’s run around in a war zone with no bullet proof vest.”  Sure, you might come out alive, but why should you take that risk?

In my opinion, the Power Query security model needs work.  Even if we could assign a digital certificate to the query to prove it (and the privacy levels) had not been modified, that would be great.  I’m not sure exactly what it is we need, but we need something better than what we have today.

Refresh Power Query With VBA

When I’ve finished building a solution in Excel, I like to give it a little polish, and really make it easy for my users to update it.  The last thing I want to do is to send them right clicking and refreshing everywhere, so I program a button to refresh Power Query with VBA.

The interesting part about the above statement is that Power Query doesn’t have any VBA object model, so what kind of black magic trick do we need to leverage to pull that off?  As it turns out, it’s very simple… almost too simple in fact.

A Simple Query

Let’s just grab the sample data file from my post on pulling Excel named ranges into Power Query.  Once we’ve done that:

  • Click in the blue table
  • Go to Power Query –> From Table
  • Let’s sort Animal ascending (just so we know something happened)
  • Next save and Exit the query

At this point, we should get a new “Sheet2” worksheet, with our table on it:

image

The Required VBA Code

Next, we need to build our VBA for refreshing the table.  Rather than record and tweak a macro, I’m just going to give you the code that will update all Query Tables in the entire workbook in one shot.  But to use it, you need to know the secret handshake:

  • Press Alt + F11

This will open the Visual Basic editor for you.  If you don’t see a folder tree at the left, then press CTRL+R to make it show up.

  • Find your project in the list (it should be called “VBAProject (Selecting Data.xlsx)”)
  • Right click that name and choose “Insert Module”
  • In the window that pops up, paste in the following code:

Public Sub UpdatePowerQueries()
    'Macro to update my Power Query script(s)
    Dim cn As WorkbookConnection

    For Each cn In ThisWorkbook.Connections
        If Left(cn.Name, 13) = "Power Query -" Then cn.Refresh
    Next cn
End Sub

Now, I’ll admit that I find this a little looser than I generally like.  By default, all Power Query scripts create a new connection named “Power Query - ” followed by the name of your query.  I’d prefer to check the type of query, but this will work.

Speaking of working, let’s prove it…  But first, close the Visual Basic Editor.

Proving The Refresh Works

The easiest way to do this is to go back to the table on Sheet 1 and add a new row to the table.  I’m going to do that first, then I’m going to:

  • Press Alt + F8
  • Choose “UpdatePowerQueries”
  • Click Run
  • Go back to Sheet2 to verify it’s updated

If all goes well,  you should now have another row of data in your table, as I do:

image

Adding Polish

Let’s face it, that’s probably harder than going to Data –> Refresh All.  The goal here was to make it easier for my users.  So let’s do that now.

  • Return to Sheet 1
  • Go to the Developer Tab (if you don’t see it, right click the ribbon, choose “Customize Ribbon” and check the box next to the Developer tab to expose it)
  • Click Insert and select the button in the top left

image

  • Left click and drag a button onto your worksheet

When you let go, you’ll be prompted to assign a macro.

  • Choose “UpdatePowerQueries” and click OK
  • While the dots are still on the corners, click in the text
  • Backspace it out and replace it with something helpful like “Update Queries” (if you click elsewhere, you’ll need to right click the button to get the selection handles back.)
  • Click in the worksheet to de-select the button

image

That’s it.  Test it again by adding some more data to the table then clicking the button.

Ramifications

I write a lot of VBA to make my users’ lives easier, and generally use this kind of technique as part of a bigger goal.  But regardless, it can still be useful as a standalone routine if you want to avoid having to train users on how to do things through the ribbon.

Create Dynamic Table Headers With Power Query

In my last post, I looked at how to pull a named range into Power Query and turn it into a table of data.  Today we’re going to look at how to create dynamic table headers with Power Query, but using a slightly different, slightly more complicated way to do the same task.  Why?  Well, the reality is that sometimes the simple method won’t work for us.

Background

In this scenario, we’re going to use the same file I used in my last post, we’re just going to build the output differently.  The key that I’m after here is that I want to pull the data range into Power Query, but I want to use a table.  The issue, however, as I mentioned in my last post, is:

If I set up the table with headers in row 3, it would convert the dates to hard numbers, thereby blowing apart the ability to easily update the dynamic column headers next year.  That would defeat the purpose of making the input sheet dynamic in the first place.

I’ve really struggled with this feature in Tables, where it converts your headers to hard values.  So many of the tabular setups I create use dynamic headers, it’s actually more rare that they don’t.  So how do we work around this?  How do we create dynamic table headers in Excel?

Setting Up The Table For Success

It’s actually a LOT easier than you might think.  I use this very simple trick with both PivotTables and regular Tables, allowing me to take advantage of their power, but still control my headers and make them dynamic.  Here’s how:

Step 1: Add Static Headers Manually

  • Insert 2 rows below the current dynamic date headers in Row 3
  • Put some static headers in Row 5 that are generic but descriptive.  (In this case, CYM# means “Current Year, Month #”)

image

The important part here is to make sure these static headers are each unique so that you can “unwind” them later with Power Query.

Step 2: Create The Table

Next, we need to create the table.  It’s going to cover A5:N24, and will therefore inherit the CYM column headers.  Since they are static values, it won’t make any changes to them, and my dynamic dates are still showing up top.

Step 3: Build a Translation Table

Huh?  A what?  Bear with me, as this will come clear a bit later.  Here’s how we do it:

  • Enter the following in B4:  =B3
  • Copy this across to cover B4:N4
  • Make sure B4:N4 is selected and go to Home –> Editing –> Find & Select –> Replace
  • Using the dialog, make the following two replacements:
    • Replace = with =$
    • Replace 3 with $3

This has the effect of making the formulas all absolute, which is important for our next step.

  • Select B4:N5 –> Right Click –> Copy
  • Select a cell down below your data range somewhere (I used A34)
  • Right click the cell and choose Paste Special
  • In the Paste Special options, choose the following:
    • Paste:  All
    • Check the Transpose box

Very cool, we’ve now got the beginnings of a table that is linked to the formulas in row 3.  We just need to finish it.

  • Add the column header of “Date” in A33
  • Add the column header of “Period” in B33
  • Format the data range from A33:B46 as a table
  • Rename the table to “DateTranslation”

image

Step 4: Final Header Cleanup

Row 4 has now served its purpose for us, so you can delete that, and then hide (the new) row 4.  The end result is that we have a header on our table that looks like it’s part of the table, but isn’t really.  The benefits here are that we can preserve the dynamic nature of it, and we have a wider variety of formatting options.

image

We also have a completely separate date translation table too…

Setting Up The Power Query Scripts

So now we need to get the data into Power Query.  Let’s do that.

Step 1: Import and Reformat the Rounds Table

To do this we will:

  • Click somewhere in the main table
  • Power Query –> From Table
  • Remove the “TOTAL” column
  • Filter the first column to remove text that begins with “Total” and values that equal null
  • Filter the second column to remove null values
  • Right click the first column and Un-Pivot Other Columns
  • Rename the “Month” column to “Round Type”
  • Rename the query to “Budget”

At this point, you should have the following:

image

This is great, but what about the Attribute column?  This is the whole big pain here about using a table, in that we don’t have dates.  Yes, we could have hard coded them, but then it would be very painful to update our query when our year changes.  So while we have something flexible (in a way) here, it isn’t really all that readable.  How can we change that?

Save and close the query, and let’s deal with this.

Step 2: Create Another Power Query

Let’s add the other table we built:

  • Click inside the DateTranslation table we created
  • Go to Power Query –> From Table
  • Click Close & Load –> Load To…
  • Click Only Create Connection –> Load

This will create a new Power Query that basically just reads your original table on demand.  It won’t refresh unless it’s called from another source.  Now we’re set to do something interesting.

Step 3: Re-Open The Budget Power Query

If the Workbook Queries pane isn’t open on the right, then go to Power Query –> Workbook Queries to show it.

  • Right click the Budget query and click Edit
  • On the Power Query Home tab, click Merge Queries (in the Combine group)
  • Select the Attribute column
  • From the drop down box, choose Date Translation
  • Select the Period column
  • Make sure “Only Include Matching Rows” is checked and click OK

image

At this point you’ll get a new column of data in your Query.  Click the icon in the top right of the column to expand it:

image

Excellent… now we only need to pick the column we need, which is Date.  So uncheck the Period column and click OK.  Finally we can remove the Attribute column and rename the “NewColumn.Date” column to “Date”, and we’ve got a pretty clean query:

image

At this point we could call it a day, as we’ve pretty much accomplished the original goal.  I can update the Year cell in B1 and my Table’s “Headers” will update.  In addition, my Power Query will show the correct values for the dates as well.  Pretty cool, as I could now link this into Power Pivot and set a relationship against a calendar table without having to worry about how it would be updated.

Going One Level Deeper

One thing I don’t like about this setup is the need for the extra query.  That just seems messy, and I’d prefer to see just one query in my workbook.  The issue, though, is that I’m pulling data from two tables.  What’s cool is that, with a little editing of the M code, I can fix that.

Here’s the M for the query I’ve built, with a key line highlighted (twice):

image

As you can see, the “Merge” line is coloured yellow, but the name of the Query being merged is in orange.  Well guess what, we don’t need to reach to an external query here, we can reach to another named step in M.  Try this: immediately after the “let” line, enter the following:

    TranslationTable = Excel.CurrentWorkbook(){[Name="DateTranslation"]}[Content],

Now, modify the “Merge” line to update the name of the table from “DateTranslation” to “TranslationTable”.  (The reason we’re doing this is that the original query still exists, so we can’t just name the first step “DateTranslation”, as it would conflict.)

Once we’ve made our modifications, the script will look as follows:

image

When you click “Done”, the query will reload and you’ll see an extra step in the “Applied Steps” box on the right.  What you won’t see, though, are any changes, as the data comes out the same.  Very cool, as we are now referencing both tables in a single step.  To prove this out, save this query, drop back to Excel and delete the “DateTranslation” query.  It will still work!  (The completed file can be downloaded here.)
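For reference, after those edits the finished query has a shape something like this sketch.  The step names, the source table name and the exact filter conditions are illustrative; yours will match whatever the UI built for you:

let
    // Read the translation table inline, instead of via a separate query
    TranslationTable = Excel.CurrentWorkbook(){[Name="DateTranslation"]}[Content],

    // "BudgetData" is an illustrative name for the main table
    Source = Excel.CurrentWorkbook(){[Name="BudgetData"]}[Content],
    RemovedTotal = Table.RemoveColumns(Source, {"TOTAL"}),
    FilteredRows = Table.SelectRows(RemovedTotal, each
        [Month] <> null and
        not Text.StartsWith(Text.From([Month]), "Total") and
        [CYM1] <> null),
    Unpivoted = Table.UnpivotOtherColumns(FilteredRows, {"Month"}, "Attribute", "Value"),
    RenamedRoundType = Table.RenameColumns(Unpivoted, {{"Month", "Round Type"}}),

    // Swap the CYM# codes for real dates via the translation table
    Merged = Table.NestedJoin(RenamedRoundType, {"Attribute"}, TranslationTable, {"Period"},
        "NewColumn", JoinKind.Inner),
    ExpandedDate = Table.ExpandTableColumn(Merged, "NewColumn", {"Date"}),
    RemovedAttribute = Table.RemoveColumns(ExpandedDate, {"Attribute"})
in
    RemovedAttribute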

Ending Thoughts

I really like this technique.  It lets me dynamically change the column names, yet still use those to link them into my data model tables.  But even more I like the ability that, with a minor edit to the M code, I can keep my workbook from being littered with extra queries.  :)

Using Non-Contiguous Data Ranges in Power Query

One of the methods we use when building business solutions is creating specific input sheets for our models, as it separates our data from our business logic and ultimately our reporting layers.  This strategy is a key piece of building stable models, and is one of the fundamental things I teach when I’m leading modelling courses.  (For more information on having me at your site, click here.)  Since many data entry points consist of non-contiguous ranges in Excel, it makes sense to look at how using non-contiguous data ranges in Power Query can help us in our modelling.

To that end, in today’s blog post, I’m going to look at a technique to take a standard data entry worksheet and turn it into a data source, which can then be linked in to the model (or just used as the basis of reports.)  This is ideal, as we then only have one place to update our data for our solution.

Background:

What we’re going to start with is this:

image

A few key things you might want to know are:

  • You can download the sample file from this link if you want to follow along.
  • This is hypothetical budget data for golf rounds.
  • All the data entry cells are light green (I tell my users “Green means go” and they are now conditioned to stay away from anything that isn’t that light green colour.)
  • The year in cell B1, when changed, will update all of the headers on the input sheet to the current year

Now, what I’d like to do is un-pivot this data into a useable table.  If I could do that, then I’d be able to use it in any of many methods, such as PivotTables, PivotCharts, charts or VLOOKUP solutions.

The initial setup

Your first temptation might be to layer a table over the data.  I’ve got a few reasons why I don’t want to do that:

  1. If I did layer a table over my input form it would lose the intuitive “enter data here” setup that I worked so hard to create.  Users would end up with a banded table that didn’t indicate where data should go.  That’s not ideal.
  2. If I set up the table with headers in row 3, it would convert the dates to hard numbers, thereby blowing apart the ability to easily update the dynamic column headers next year.  That would defeat the purpose of making the input sheet dynamic in the first place.
  3. If I set up the table with headers above that, I’d end up with a bunch of ugly Column1, Column2 headers.  I suppose I could hide them, but again, my table would blow apart my formatting as mentioned in 1.

So no… that’s not what I’m after.  So now what…?

There’s actually a few different ways to handle this.  I’ll look at some others in later blog posts, but for now let’s piggyback on the technique about using named ranges that I covered in my last post.

Creating the Named Range

The first thing I’m going to do is create a named range (not a Table) to cover the entire range that holds my budget data: A3:N22.  That includes the headers, all blank rows, and all data right up to the totals on the right and bottom.  The reason for this is that, should I end up adding any new categories later by inserting rows or columns, it’s fairly likely that I’ll be doing it before the totals, so they’ll get picked up.  (It’s not quite the auto-expansion feature of a table, but it’s the best I’m going to get with a named range.)

So I created my named range and gave it a sensible name like rngBudgetData.

Pulling the data into Power Query

Now let’s look at the easiest way to get our named range into Power Query.

If we click a single cell in our data range and then tell Power Query to get the data From Table, we’d be given a range that expands to cover only the contiguous block of data around that cell.  Because we’ve got blank rows in our data, that wouldn’t really work for us.  For example, assume we click B3, then tell Power Query to pull data From Table.  We’d be given the range A12:N15, as shown below.

image

Alternately, we could create a blank query, then type =Excel.CurrentWorkbook() in the formula bar, and choose our table, as outlined in the last blog post:

image

Okay, so that WOULD work.  As it turns out though, there is still yet another way to do this:

  • From the Name drop down in Excel, select the rngBudgetData named range

image

  • A3:N22 will now be selected, so go to Power Query –> From Table

And look at that, we’re now in Power Query using our named range!

The secret to this is, in order for Power Query to pull in the named range, the entire named range must be selected.  If that happens, Excel will grab it as the Power Query source.  If you’re using named ranges that cover contiguous blocks of data, and you click somewhere in that block of data, then Power Query very well may grab your entire named range.  So long as the auto selection boundaries line up with your named range you should be good.  But in the case of my data, those blank rows blow that apart, which is why I selected it from the name drop-down first.

(It is worth noting that if a named range and a table’s boundaries match exactly, the Table will be used as the Power Query source, not the named range.)

Manipulating the data in Power Query – Issue 1

Upon pulling the data into Power Query, we end up with a table like this:

image

Naturally, the first thing we want to do is promote the first row to headers, so that we can get to un-pivoting the data.  But look what happens when we do:

image

What the heck?  The first column was renamed to Month (from the first row of data), but the remaining columns didn’t change!  Not only that, but we lost our date/time stamps.  That’s not good.

So it appears that Power Query can’t promote a date/time into a column header.  We’d better remove that step and try another approach.

Removing Irrelevant Data – Cut 1

Before I get into dealing with the harder stuff, I’m going to do a quick bit of cleanup here.  I’ve got some blank rows of data in my table, so I’m going to knock those off first.  To do that I:

  • Filter Column1 and uncheck (null) values

I’ve also got some header rows, which I can identify by the null values in Column2-Column14.  I can knock those off as follows:

  • Filter Column2 and uncheck (null) values

Easy enough so far.  I also don’t need the Total column, so I’ll scroll all the way to the right and:

  • Right click Column14
  • Choose Remove

Now I need to get rid of all the rows that contain totals (as I can always re-create those with a PivotTable):

  • Filter Column1
  • Text Filters –> Does Not Contain –> “Total “ –> OK

Now, you’ll notice that I used Total with a space.  This is just to make sure that I don’t accidentally remove a row that contains a word which has total in it – like “totally”.  It’s always safer to make your text pattern as exact as possible.

After doing all of the above, I’m left with this:

image

So the data is all clean, but I still can’t promote my headers so I can un-pivot this data.  Now what?

Busting Out Transpose

Transpose is quickly becoming one of my favourite functions.  Let’s hit it now:

image

Hmm… originally I was going to convert my dates to text, Transpose it back, promote the text to headers, then un-pivot it.  But I don’t think I even need to do that at all.  Try this:

  • Transform –> Use First Row as Headers
  • Select the Month column
  • Transform –> Unpivot Columns –> Unpivot Other Columns

Bingo!

image

Now just for the final cleanup:

  • Select the Month column –> Transform –> Data Type –> Date  (You need to do this, or Excel will return date serial numbers to the table, not dates formatted as dates.)
  • Rename the Attribute column to “Round Type”
  • Rename the Value column to “Amount”

Finally we can choose to Close and Load the Power Query and it will turn it into a nice Excel table.
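If you peek at the Advanced Editor when you’re done, the whole thing boils down to something like this sketch (step names and the exact filter conditions are illustrative):

let
    Source = Excel.CurrentWorkbook(){[Name="rngBudgetData"]}[Content],

    // Remove blank rows, section header rows, the TOTAL column and the total rows
    NoBlankRows = Table.SelectRows(Source, each [Column1] <> null),
    NoHeaderRows = Table.SelectRows(NoBlankRows, each [Column2] <> null),
    NoTotalColumn = Table.RemoveColumns(NoHeaderRows, {"Column14"}),
    NoTotalRows = Table.SelectRows(NoTotalColumn,
        each not Text.Contains(Text.From([Column1]), "Total ")),

    // Transpose so the text labels land in row 1, where they CAN be promoted to headers
    Transposed = Table.Transpose(NoTotalRows),
    Promoted = Table.PromoteHeaders(Transposed),

    // Unpivot everything except the Month column, then tidy up
    Unpivoted = Table.UnpivotOtherColumns(Promoted, {"Month"}, "Attribute", "Value"),
    TypedDates = Table.TransformColumnTypes(Unpivoted, {{"Month", type date}}),
    Renamed = Table.RenameColumns(TypedDates, {{"Attribute", "Round Type"}, {"Value", "Amount"}})
in
    Renamed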

Some Thoughts

As I was developing this blog post I found about 10 different ways through this process to end up at the same goal.  This is actually one of the things I enjoy about Power Query: the creativity that you can employ to reach the same end result.

One feature I do hope to see eventually is the ability to filter for text that “starts with” or “does not start with”.  That would allow me to make my text searches even tighter than the “contains” portion I used here.  In one version I actually wrote an if function to do that:

=if Text.Start([Column1],5)="Total" then "remove" else "keep"

The loss of the months when originally promoting the header row shocked me a bit.  I would have expected this to be automatically converted to text and those values used.  Alas that’s not the case, so we have to do a bit more work.  Certainly not the end of the world, but not what I’d expect to see.

At the end of the day though, it’s nice to know that there is a way to get useful non-Table data into Power Query and turn it into something useful.  :)