Out of Memory error consistently happening with Power Query

akephart

Hi,

I have created a query (connection only). I have multiple columns that I have been cleansing. Sometimes, after I have done multiple steps, I realize the results are not quite what I thought. I have found that when I go back and delete some steps, I will usually get an Out of Memory error at some point. I have learnt the hard way that I should save and back out of Excel before I try this so I don't lose my progress. But this time, when I reloaded, I got an Out of Memory error as well. So now I'm perplexed. I have sent some frowns to Microsoft, but I'm wondering if anybody else has had this happen. I use Excel 2010.
 
Hard to say without seeing your data, but I'm curious about a couple of things...
- How much data are you bringing in? (How many rows/columns is the source data?)
- How much RAM do you have in your PC?
- Are you running 32-bit Office 2010 or 64-bit Office 2010 (Office, not Windows)? You can find out from File --> Help --> About Microsoft Excel. My system says 14.0.7173.5000 (64-bit).

The reason I ask is that if you're doing heavy processing with a lot of data and you're running 32-bit Office, that could give rise to this situation. With 32-bit Office I believe Excel can only address a maximum of 2 GB of RAM, and that has to support all open workbooks, data and add-ins. Closing other workbooks and unloading add-ins MAY give you a bit more to work with there, but an upgrade to the 64-bit version would probably be the best option.
 
Out of Memory error

I have to revisit this issue, as I have had no success in resolving this memory escalation.
I constantly have to keep Windows Task Manager open to monitor the memory escalation and save my work before Excel locks up with a memory error.
I have had no success with incrementally rebuilding my Power Query project to see if I could identify some sort of runaway process.
My only current thought is that I have some query (or queries) that are self-referential, repeatedly referencing themselves in a loop and finally maxing out memory.
Right now Excel is locked up at 2.9 GB...
At some points I have noticed RAM usage escalating to (say) 2 GB; then, for no apparent reason, it will swing down to 300, 400 or even 800 MB, which to me sounds reasonable. But it never remains there; it drifts between OK and dangerously high.
I don't believe my scripting practices are suspect, but I have no way of confirming this.
Moving to 64-bit is not an option for a number of reasons, not least of which is that the client systems are 32-bit. And I would rather solve the root problem than manage the symptom.

I will persist in trying to solve the problem, but I believe my money's on garbage collection. At this very moment the Excel project is running at 54 MB!
Please say it's garbage collection!!!

Fran

Your opinion?
 
As you are mentioning self-referencing code: in this link I posted an example of a query with a recursive function (one that calls itself).
Essential to getting this working correctly is that the table is buffered somewhere in the function, as I did in this step:

Code:
Expanded = Table.Buffer(Table.ExpandTableColumn(MergedTable, "NewColumn", {"Index"}, {"NewIndexColumn"})),
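
For what it's worth, here is a minimal, self-contained sketch of that pattern. The table, column names and stopping condition are invented for illustration (they are not from the linked example); the point is just that the table is buffered before the function calls itself again via the @ operator.

Code:
let
    // Hypothetical starting table with a single Index column, buffered once
    // so later steps read it from memory instead of re-evaluating the source.
    Source = Table.Buffer(Table.FromRecords({[Index = 1], [Index = 2], [Index = 3]})),

    // Recursive function: doubles Index until the largest value reaches 100.
    // @Grow refers to the function itself from inside its own definition, and
    // Table.Buffer pins down the intermediate result before the next call.
    Grow = (t as table) as table =>
        let
            Doubled  = Table.Buffer(Table.TransformColumns(t, {{"Index", each _ * 2}})),
            MaxIndex = List.Max(Doubled[Index]),
            Result   = if MaxIndex >= 100 then Doubled else @Grow(Doubled)
        in
            Result,

    Output = Grow(Source)
in
    Output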

Hope this helps.
 
The other thing you may want to try here is making some changes to the defaults for PQ. If you go into the Power Query options, try playing with these:
-Global
-->Data Load --> Fast Data Load = on
-->Data Load --> You could try clearing the cache... I don't know if that will help, but it may
-->Privacy --> Always ignore (this can drastically speed up your queries)

-Current Workbook
-->Data Load --> Allow data preview to load in background = OFF

Those may reduce the amount of load you're putting on your workbook, so it's worth toggling some of them.

HTH,
 
One of the issues I have found when working with 400 MB text files is that PQ has a hard time with the quote (") symbol. I have removed those by opening my text file and doing a find/replace (since I don't need them anyway), and that has not only improved performance but also avoided issues of column misplacement, because PQ reads the quote (") as the end of the column.
 
One of the issues I have found when working with 400 MB text files is that PQ has a hard time with the quote (") symbol.

I think this really depends on how you are importing the file. Power Query can be set to treat the " character as a text qualifier or not, as you choose. Column misplacement isn't really related to the " character, but rather to the system that is generating the data for you. If that system gets it wrong, that will be the issue.

You can also pull the data into Power Query as a plain text file without replacing the quotes, then use Power Query to remove them all.
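
If it's useful, here is a rough sketch of that approach in M. The file path, delimiter and column handling are placeholders (your own source will differ); the idea is to turn quote handling off with QuoteStyle.None so the " characters come through as plain text, and then strip them out of every column afterwards.

Code:
let
    // Read the file with quoting disabled so " is treated as ordinary text
    // rather than as a text qualifier (QuoteStyle.Csv is the quoted behaviour).
    Source   = Csv.Document(
                   File.Contents("C:\Data\BigExport.txt"),
                   [Delimiter = "|", QuoteStyle = QuoteStyle.None]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),

    // Remove the stray quote characters from every column in one pass.
    Cleaned  = Table.TransformColumns(
                   Promoted,
                   List.Transform(
                       Table.ColumnNames(Promoted),
                       each {_, each Text.Remove(_, """"), type text}))
in
    Cleaned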
 
Hi Ken,

It makes sense to remove the " character in Power Query instead. Thank you.
 
Another Option

I use Office 365 / Excel 2016 32-bit to stay compatible.
Lots of memory issues.

Power BI Desktop is an option:
- free download, standalone program
- 64 bit
- updated monthly
- seems stable
- you can transfer Power Query code to / from Excel (Advanced Editor --> copy and paste; see the sketch after this list)
- combines Power Query / Power Pivot / Power Map
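
For anyone trying the copy/paste route, the unit you move is the complete let ... in block shown in the Advanced Editor. A trivial example (the file path is just a placeholder) looks the same in either program:

Code:
let
    // Select everything from "let" to the expression after "in" in the
    // Advanced Editor, then paste it into a blank query on the other side.
    Source   = Csv.Document(File.Contents("C:\Data\Sales.csv"), [Delimiter = ","]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
in
    Promoted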
 