How do You Design Your Data Model?

On this blog, I showcase a lot of different techniques for manipulating and reshaping data.  For anyone who follows the blog, you already know this, and you know it's a pretty important topic to me.  But the thing we shouldn't lose sight of is WHY we do this: it's to drive analytics. I'm fairly convinced that the majority of the loyal readers here already know this. Thus, I wanted to ask your opinion on something...

Raw data to data model, passing through the human brain

How do you design your data model?

What I'm specifically interested in is how you approach designing the Fact and Dimension tables you use for your Power Pivot model.  And I'm not really talking about Power Query here; we all know you should be using what you learned from our recently relaunched Power Query Academy to do the technical parts.  😉

What I'm more interested in is the thought process you go through before you get to the technical bit of doing the data reshaping.

If you read books on setting up a data model, you'll probably be told that you need to do the following four steps:

  1. Identify the business process
  2. Determine the grain of the model
  3. Design your Dimension tables
  4. Design the Fact tables

So if you're asked "how do you design your data model", do these steps resonate with you, and why?

Do you consciously sit down, and work through each of these steps in order?  I suspect that many self-service BI analysts skip the first step entirely as they are implicitly familiar with their business process.  (As a consultant, I ask a lot of questions in this area to try and understand this before building anything.)

Do you design the reports on paper, then work backwards to the data you'll need, go find it and reshape it?  Or do you go the other way, trying to collect and reshape the data, then build reports once you think you have what you need?

Do you explicitly define the model grain?  And if you do, what does that mean to you?  Is it restricted to "I want transactions at a monthly/daily/hourly level"? Or do you go deeper, like "I want transactions at a daily level and want to break them down by customer, region and product"?

Why the question?

There are actually two reasons why I'm asking this question:

Reason 1 is that I think healthy discussion makes all of us better.  I'd like to hear your thoughts on this as I'm probably going to learn something that I haven't discovered in my own learning journey.

Reason 2 is that my whole business is around teaching people how to do these things, and I'm always looking to make things clearer.  The more opinions I hear (even if they contrast with each other), the more I can help people understand this topic.

So sound off, please!  We'd all love to hear how you approach the task of building a data model.

Do Data Types Matter in Power Query?

One of the things that I find a bit misunderstood in Power Query is whether or not Data Types matter. I mean, I’m sure everyone agrees to some extent that they do, but do you know just how much they matter, and why?

Over the weekend, I received an email from one of our loyal students which read, in part:

I am going under the assumption that in Power BI nothing is free. Steps, calculated columns, measures and so on could be cheap but they are never free. It is with this premise in mind that I pose the following theory.

Users should sparingly use Change Type in Power Query. It is fine to Change Type when converting Date/Time to Date, Decimal Number to Whole Number, and others that actually change the value. It is a waste of resources to Change Type from Decimal or Whole Number to Currency. Even if you want the column to look like Currency, you can apply that format in the Data Model view and save one Power Query step.

On the face of it, this theory seems somewhat reasonable. After all, adding additional steps is bound to add some overhead to the Power Query process in most cases. And let’s be honest, in small models, it may make no difference to you at all. But when things get bigger…

Data Types vs Formats

To understand whether Data Types matter, we need to get something very clear right off the bat: Data Types and Formats are not the same thing. Data Types dictate what kind of data you have and determine how much memory is allocated to store a value. Formatting, on the other hand, tells you how you want the values to appear. To see this in practice, have a look at the following values in Power Query, where the Data Type has been set to Currency:

Column set to Currency Data Type

Notice that the only place you see a $ sign is in the header. And see how the decimal numbers do not line up? I can tell you from many years of teaching accountants that this drives them bonkers. That 1.7 needs to be 1.70! But you don’t fix that here; you do that in the Excel worksheet, Power Pivot model or Power BI visual.

The key to remember here:

  • In Power Query, you define Data TYPES
  • In Power Pivot, Excel or Power BI, you define Data FORMATTING
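
To make that distinction concrete, here's a minimal sketch of the data type side as it might look in Power Query's Advanced Editor (the table and column names are made up purely for illustration):

    let
        // Pull a table named "Sales" from the current workbook (illustrative source)
        Source = Excel.CurrentWorkbook(){[Name = "Sales"]}[Content],
        // This defines the data TYPE: Currency (Fixed Decimal) stores values to four decimal places
        Typed  = Table.TransformColumnTypes(Source, {{"Amount", Currency.Type}})
    in
        Typed

The $ sign, the thousands separator and that forced second decimal are all FORMATTING, so they never appear in the query at all; they get applied to the loaded column in Excel, Power Pivot or Power BI.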

Excel’s Data Types

In the classic days of Excel, we only had four data types to worry about. Those were:

  • Text,
  • Numbers,
  • Blanks, and
  • Errors

(Errors, although represented by textual patterns like #REF!, #N/A, #NAME?, and the like, actually count as a different data type.) Today it gets a bit more complicated with the Stock and Geography data types, as I blogged about here, but in classic Excel anything else was just a number that was formatted with a number formatting string. Examples of these include:

  • Date: format it as a number, and you’ll see the number of days since Jan 1, 1900
  • Time: a little trickier, but this is just a fraction of a day where .25 is 6:00 AM, .5 is 12:00 PM and .75 is 6:00 PM. And if all you have is the time, and you format it as a DateTime, you get that time on Jan 1, 1900.
  • True/False (Boolean): while this shows in the cell as TRUE/FALSE, which looks like text, if you multiply them by 1 you’ll see that TRUE equates to 1, where FALSE equates to 0.

But that was Excel, where data types and number formats were the same thing. That’s not Power Query.

Power Query’s Data Types

Unlike Excel, which has a pretty short list of data types, Power Query seemingly has a ton:

  • Numeric Data Types:
    • Decimal Number
    • Currency (Fixed Decimal)
    • Whole Number
    • Percentage
  • DateTime Data Types:
    • Date/Time
    • Date
    • Time
    • Date/Time/Zone
    • Duration
  • True/False (Boolean)
  • Text
  • Others (Binary, Tables, Records, Lists, and more)
  • Any (the dangerous “undefined” type which allows the application to determine the correct data type)

The key to recognize is that each of these data types is DISTINCT, meaning that each of these is different in some way from every other. (While we won’t get into it in this post, unlike Excel which implicitly converts data from one type to another, Power Query requires explicit type conversion via use of functions like Number.From() or Date.ToText(), which can be frustrating at times!)
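
To see what that explicit conversion looks like, here's a quick sketch (the values are invented, but the functions are the standard library ones):

    let
        PriceAsText   = "19.99",
        PriceAsNumber = Number.From(PriceAsText),                      // text to number: 19.99
        DateAsText    = Date.ToText(#date(2018, 7, 3), "yyyy-MM-dd"),  // date to text: "2018-07-03"
        FlagAsNumber  = Number.From(true)                              // logical to number: 1
    in
        [Price = PriceAsNumber, OrderDate = DateAsText, Flag = FlagAsNumber]

In Excel, ="19.99"+1 just works thanks to implicit conversion; in Power Query, "19.99" + 1 throws an error until you wrap the text in Number.From().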

For the purpose of this post, however, I want to focus on the first three numeric types: Decimal Number, Currency and Whole Number, and ask the question: Does the Data Type matter?

Illustration Background

Let’s have a look at an example. For the illustration, I set up a Power Query chain that looks like this:

Viewing our Query Chain

The Data table had 100,000 randomly generated [Sales] records that vary between 1.27317262341058 and 100000.017761279, and randomly generated [Customers] that vary between 1 and 1000.

The only change I made in the Whole, Currency and Decimal types was to set the Data Type for the [Sales] column accordingly. In the Customers table, I removed the [Sales] column and duplicates from the Customers column.
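
I'm not reproducing the exact queries from the illustration here, but if you want to play along at home, a sketch along these lines generates a similar test table (the seeds and scaling are arbitrary):

    let
        RowCount  = 100000,
        // List.Random returns values between 0 and 1, so scale them up to roughly the ranges above
        Sales     = List.Transform(List.Random(RowCount, 1), each _ * 100000),
        Customers = List.Transform(List.Random(RowCount, 2), each Number.RoundUp(_ * 1000)),
        Data      = Table.FromColumns({Sales, Customers}, {"Sales", "Customers"}),
        // The Whole, Currency and Decimal queries differ only in this one typing step:
        WholeTyped    = Table.TransformColumnTypes(Data, {{"Sales", Int64.Type}}),
        CurrencyTyped = Table.TransformColumnTypes(Data, {{"Sales", Currency.Type}}),
        DecimalTyped  = Table.TransformColumnTypes(Data, {{"Sales", type number}})
    in
        DecimalTyped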

I then created 3 quick measures to sum the column up, and dropped them on a Pivot:

Summing up our 3 measures

I don’t think it would surprise anyone here that the [Whole Sum] is slightly different than the [Currency Sum] and [Decimal Sum]. After all, the numbers were rounded at the source before being added up. And so far, we can see that the [Currency Sum] and [Decimal Sum] look pretty much the same. At least until we expand them a bit:

Decimal differences between Currency and Decimal Data Types

The only thing that should be a surprise here is that currency only holds up to four decimals, not two as most people expect. This actually makes sense when you start thinking about foreign exchange transactions, and how they are always carried to four decimal places.

But is that it? Is a Data Type just about rounding? As it turns out the answer to that is both yes and no.

Testing the Model Memory

The next step here was to test the model memory and see how Power Pivot is storing the data. To that end, here’s a little table that shows exactly that:

Table illustrating how Power Pivot is storing the data

Before we get into this, I want to call out something important here. The Data Type that is showing has been read from Power Pivot. Notice that Whole Number shows as such, consistent with Power Query. But the Currency and Decimal tables both show Decimal. As it turns out, Power Pivot doesn’t make a distinction between these two data types. However, the distinction between these two Data Types matters to you anyway, as I’ll explain.

So, what does that all mean?

In the Whole Number table, I had rounded off all the decimals. This left 63,815 unique values. And because Power Pivot compresses based on unique values, it deals with this column very well, resulting in a total of 260.54 KB to store these values.

In the Currency table, I effectively rounded all the values off to four decimal places. This left 99,996 unique values in the table (only 4 values were repeats). Despite this, Power Pivot did a good job of compressing the values in memory, resulting in 390.75 KB to store them.

Then we get to the Decimal column. There are only 4 more unique values than in the Currency column, but the memory takes a colossal 5,234.47 KB to store the values, vs the 390.75 KB of Currency. What the heck is going on?

The answer lies in the fact that Power Pivot has to carry all of those decimal places, and once it does, it can flip to storing data using Hash Encoded memory. That’s a deep topic for another time, but suffice it to say that this is a bad thing, as Value encoding is much more efficient. (Hash is generally used for Text, and Values for… you guessed it… values!)

Interestingly, if you round the Decimal Number to 5 decimals you end up with 99,999 unique values in the column and a very minor change to the memory used. But if you round it to 4 decimals, the memory of the Decimals column compresses the same as Currency, and the memory need drops to the same 390.75 KB.

Table illustrating memory requirements of the different Data Types
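
If you'd rather apply that rounding in Power Query than go back to the source, a single step handles it. This sketch assumes the untyped table lives in its own query called Data (as in the sketch earlier), and it combines the rounding with the Currency type in one go:

    let
        Source  = Data,
        // Round [Sales] to four decimals and store it as Currency (Fixed Decimal) in the same step
        Rounded = Table.TransformColumns(Source, {{"Sales", each Number.Round(_, 4), Currency.Type}})
    in
        Rounded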

Why Model Memory Matters

And this is the final part of the equation to me. Not all time is created equal. My users will accept a 2-minute refresh of the data model. They might say it takes time, but +/-30 seconds in a refresh isn’t anything that they’ll get overly concerned with. They’ll click Refresh, go grab a coffee, then come back to work with the file.

But if they click a slicer and it takes 10 seconds to redraw the Pivot Table or Power BI visuals? Watch out! They’ll claim the model is too slow, ineffective, wrong and useless. I’d MUCH rather push resource consumption into the initial refresh in order to build a memory-efficient model that performs well when being used for analysis.

Wait, what happened to the Data Types?

To be fair, Power Pivot’s compression mechanism is more about unique values and the length of precision than it is about Data Types. But it’s up to you to choose the correct Data Type to future-proof your model and make sure that the compression algorithms can be applied.

But due to the way Excel has always worked, the way the Data Types are named, and the fact that most modellers don’t have a clear understanding of Formatting vs Data Types… users are more likely to pick Decimal over Currency. I mean, why would I ever format my units as Currency? (In Power BI this is called a Fixed Decimal, although it still shows with the currency symbol.)

We need to recognize that lagging decimals really do happen in business. Let’s look at that Units column for a second. Naturally we never sell a partial unit… or do we? I’ll give you 3 examples of things I’ve seen in the Food and Beverage industry that forever changed my opinion on this:

  1. We had a couple of customers who insisted that they be able to purchase half a muffin. I’m not even kidding here. The killer for me was that it only showed up every 10,000 rows of transactions or so, meaning that the column often got set to Whole Number incorrectly.
  2. The salesperson agrees to provide those 300 units for $5,000. No big deal, except that they monkey the price field in your database to make it work, and suddenly you have a sales price of $16.666667. Fortunately, this one usually gets rounded via setting it to a Currency Data Type, as that’s logical. But what if you set it to Decimal or left it undefined?
  3. Things go the other way and the customer agrees to a package price of $5,000 for something that usually carries a price of $17.00 per unit. It gets rung into the system with a reverse-engineered quantity of 294.1176470588235 to get it exactly right.

The last is the real kicker as the memory suddenly flips from Value to Hash, the RAM needed to refresh the data expands exponentially and brings your model to its knees. And now every slicer click has gone from 1 second to update your visuals to the better part of 10 seconds. And your audience is screaming that the model is “unusable”.

My Recommended Practice

The final step in every query I build which loads to a destination (an Excel Table or the Data Model) is to correctly define my Data Types. This does two things for me:

  1. It ensures that I never accidentally load a column with an undefined Data Type. (In the case of Dates, they load to Excel as numbers, and to the Data Model as text!)
  2. It ensures that I’m cutting down to the minimum number of decimals I’ll ever need for my analysis.

Does it take more processing time? Maybe marginally. But does it future-proof my solution to keep it performant? Definitely. And it ensures the time to do so happens in the refresh, not when the model is being used for analysis.
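
As a quick sanity check, you can even ask Power Query to confirm that nothing slipped through untyped: Table.Schema() reports each column's declared type, so filtering it for "Any.Type" flags any column left undefined. (This is just an idea sketch; the query name below is hypothetical.)

    let
        Source  = SalesQuery,  // hypothetical name: point this at whichever query you're about to load
        Untyped = Table.SelectRows(Table.Schema(Source), each [TypeName] = "Any.Type")
    in
        Untyped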

Some Final Thoughts

The issues you saw here with memory also affect DateTimes in a big way, as they are quite similar to decimal numbers, where time is represented as the fraction of a day.

The article touches on some things from a much bigger topic: how to optimize a Power Pivot/Power BI data model. I actually teach a full-day course on this topic for CPABC, where we cover how the Vertipaq engine compresses its data, and why you want to avoid calculated columns and fall in love with measures. We make extensive use of Power Query in this course to reshape tables (applying correct data types along the way) so that they can be consumed by the data model in a more efficient manner. We dissect a poorly built model, test its memory and then rebuild it in stages, seeing how our changes make an impact.

And if you’re looking for tools to help with this process… stay tuned to this blog. I’m hoping to have an announcement on that front within the next couple of months.

Building BI in Excel Course

Are you interested in learning how to clean and shape data with Power Query, as well as how to model it using Power Pivot? Don’t know which of these mysterious skills to tackle first? Want to learn about building BI in Excel where you create refreshable and maintainable solutions?

Good news: Ken Puls will be in Wellington, New Zealand on February 25-26, 2019 leading a live 2-day, hands-on session covering these essential skills!

What does Building BI in Excel cover?

In Day 1, you’ll learn how Power Query can clean up, reshape and combine your data with ease – no matter where it comes from. You can convert ASCII files into tables, combine multiple text files in one shot, and even un-pivot data. These techniques are not only simple, but an investment in the future! With Power Query’s robust feature set at your fingertips, and your prepared data, you can begin building BI in Excel using Power Pivot. The best part is that these dynamic business intelligence models are refreshable with a single click.

Un-pivoting Subcategorized Data

Un-pivoting subcategorized data is easy with Power Query

Day 2 focuses on Power Pivot, a technology that is revolutionizing the way that we look at data inside Microsoft Excel. Power Pivot allows you to link multiple tables together without a single VLOOKUP statement. It also enables you to pull data together from different tables, databases, the web, and other sources like never before. But this just scratches the surface! We'll also focus on proper dimensional modeling techniques and working with DAX formulas to report on your data the way you need to see it.

Top Selling Servers Report

Build dynamic reports that are easy to filter and refresh

Who is this course for?

Building BI in Excel is for anyone who regularly spends hours gathering, cleaning and/or consolidating data in Excel. It's also valuable for anyone responsible for building and maintaining reports. Participants must have experience using PivotTables. Prior exposure to Power Pivot and Power Query is not required, but it is a bonus.

Where do I sign up?

We are offering this course in conjunction with Auldhouse, a leading computer training company in New Zealand. Go to the Auldhouse site and use the following promo code EARLYBIRD20 to give yourself a 20% discount.

This will knock $300 NZ off the course, bringing it down to $1200. That’s $600 per day for pretty much the best introduction to both Power Query and Power Pivot that money can buy! Then use your new skills to free up 90% of your data-wrangling time, giving you time to negotiate a 20% pay increase*. Unbeatable ROI!

Don't miss out, the early bird discount is only available until January 31, 2019! Visit the Auldhouse site today for full details and registration.

*Numbers are indicative only, your mileage may vary. Heck, it may be way better than that!

Power Pivot eBook Coming Soon

It's been a long time coming, but we are putting the finishing touches on the third installment of our free 'DIY BI' series. We are excited to announce that the Power Pivot eBook will be officially released on Tuesday, July 3, 2018!

Power Pivot eBook

This brand new book will feature five of Ken's top tips, tricks, and techniques for Power Pivot, including:

  • Hiding fields from a user
  • Hiding zeros in a measure
  • Using DAX variables
  • Retrieving a value from an Excel slicer
  • Comparing data using one field on multiple slicers

About the 'DIY BI' Series

This free eBook series is available to anyone who signs up for the monthly(ish) Excelguru email newsletter. The series includes four books, one edition each for Excel, Power Query, Power Pivot, and Power BI. Each book contains five of our favourite tips, tricks, and techniques which Ken developed over years of research and real-world experience.

DIYBI eBook Series

We first launched this series in the spring of 2017 with the Excel Edition, and the Power Query edition followed later that summer. You can read some more about why Ken decided to create this series in his initial blog post about it.

The Excelguru Newsletter

The monthly Excelguru email newsletter features the latest updates for Excel and Power BI, as well as upcoming training sessions and events, new products, and other information that might be of interest to the Excel and Power BI community.

Don't Miss Out, Get Your Free Copy of the Series

If you're not already a newsletter subscriber, you can sign up here. We will send you the Excel Edition right away, and the Power Query Edition a few days later. All of our current and new subscribers will receive the Power Pivot edition once it is released on July 3, 2018. Be sure to keep an eye on your inbox for the new book.

We will be continuing to work on the fourth and final book, the Power BI Edition, over the coming months so stay tuned for details!

Protect Power Queries

How you protect Power Queries is a question that will come up after you've built a solution that relies heavily on Power Query, especially if you're going to release it to other users.

(This is a quick post, as I'm in Australia at the Unlock Excel conference, but I still wanted to get a post out this week.)

Can you Protect Power Queries?

The answer to this is yes, you can.  It’s actually very easy, and prevents your users not only from modifying your queries, but also from adding new queries to the workbook. Essentially, it shuts the door on any additions or modifications to query logic, while still allowing queries to be refreshed… at least, it should.

So how do we Protect Power Queries?

To protect Power Queries we simply need to take advantage of the Protect Workbook Structure settings:

  • In Excel (not Power Query), go to the Review tab
  • Choose Protect Workbook
  • Ensure that Structure is checked
  • Provide a password (optional)
  • Confirm the password (if provided)

Once you’ve done this, the Power Query toolsets will be greyed out, and there is no way for the user to get into the editor.

Does refresh work when you Protect Power Queries?

This part kills me.  Seriously.

The answer to this question depends on whether or not you use Power Pivot.  If you don't, then yes, you're good to go.  As long as all your tables land on worksheets or as connections, then a refresh will work even when you protect Power Queries via the Protect Workbook method.

If, however, you have even a single Power Query that lands in the data model, you're stuffed.  If Power Pivot is involved, then the refresh seems to silently fail when you protect Power Queries using this method (and I don't know of another method short of employing VBA, which is a non-starter for a lot of people).

It's my feeling that this is a bug, and I've sent it off to Microsoft, hoping that they agree and will fix it.  We need a method to protect both Power Query and Power Pivot solutions, and this would do it, as long as the refresh will consistently work.

Caveats about locking your workbook structure:

Some caveats that are pretty standard with protection:

  • Losing your password can be detrimental to your solution long-term. Make sure you have some kind of independent system to log your passwords so this doesn’t happen to you. And if your team is doing this, make sure you audit them so you don’t get locked out when a staff member leaves for any reason.
  • Be aware that locking the workbook structure also locks the ability for users to get into Power Pivot.
  • Workbook security is hackable with brute force macro code available on the internet for free. (Please don’t bother emailing me asking for copies or links to this code. I don’t help in disseminating code which can be used to hack security.) While protecting the workbook structure will stop the majority of users from accessing your queries, it should not be mistaken for perfect security.

Free ‘DIY BI’ e-Books

Today I wanted to just make a quick announcement that we are currently working on a series of free 'DIY BI' e-Books.

Free 'DIY BI' e-Books? Tell me more!

Over the past few years of working with Excel and Power BI, I've obviously picked up a few different methods, tips and tricks for working with the software.  And looking at how successful our free e-Book "Magic Tricks for Data Wizards" has been through the Power Query Training site, I thought it would be nice to do something similar for Excelguru readers.

One of the cool things about the Excelguru audience at this site is the diversity.  A lot of people originally came here for Excel, but we've been exploring Power Query, Power Pivot and Power BI for the past few years as well.  The one thing that ties us all together is that we are building "Do it Yourself Business Intelligence" or "DIY BI".

My original plan was to release one e-Book with 20 different tips, tricks and techniques; 5 each for Excel, Power Query, Power Pivot and Power BI.  After getting started, however, I realized that it was going to take me a bit longer to get that all done than I wanted.  But since I want to get information out to our readers, I've decided to break this down into four separate e-Books which will be collected under the umbrella of "DIY BI Tips, Tricks and Techniques".  Each e-Book will focus on one specific area of the DIY BI story.

What will the free 'DIY BI' e-Books include?

Well… tips, tricks and techniques, of course. 🙂 Okay, seriously, each is fully illustrated and written to give you some great examples and ideas that I hope will help you in your DIY BI journey.

Here is what is covered in DIY BI Tips, Tricks and Techniques for Excel:

  • The easiest formula to return the end of the month
  • Show a message when cells are hidden
  • Quick alignment of objects
  • Easy to read variances
  • Show a message if your Pivot data is stale

Sample image from DIY BI Tips, Tricks & Techniques for Excel

What areas will the free 'DIY BI' e-Books cover (and when will they be released)?

Those e-Books will be released in the following order:

  • DIY BI Tips, Tricks and Techniques for Excel
  • DIY BI Tips, Tricks and Techniques for Power Query
  • DIY BI Tips, Tricks and Techniques for Power Pivot
  • DIY BI Tips, Tricks and Techniques for Power BI

The first is already written; we just need to lay it out and make it look a bit more awesome.  Our target is to get it released by the end of next week.

With regards to the rest, I'll go as fast as I can on them, but as you can imagine, doing things right does take time.  I would expect that each will take 2-3 weeks to build out properly, but if I can get them out faster I most certainly will.

Am I going to need Excel 2016 to get value from the free 'DIY BI' e-Books?

No.  While I highly advocate being on a subscription version of Excel 2016, you'll find content in each of the first three e-Books which can be used in prior versions of Excel.

How do I receive the free 'DIY BI' e-Books?

You sign up for the Excelguru newsletter.  It's just that easy.  As soon as each e-Book is finished, we'll be emailing it to everyone who is currently subscribed to our newsletter.

And in the meantime, you also get a monthly email from us which now includes news about the latest updates to both Excel and Power BI.

Longer term, once all four e-Books are written, any new subscribers will receive the first e-Book upon signup, and then the next in the series will arrive every couple of days until you have the full set.

So what are you waiting for?  Sign up right here and don't miss out on free DIY BI Tips, Tricks and Techniques for your work!
