Woops I ran out of memory while processing my tabular model
Written by Kasper de Jonge
Sunday, 15 July 2012 16:53
I have been playing with a model that barely fits on my 4 GB VM (on purpose, to test something). 1 GB is already taken by other models and the OS. I imported six identical tables of 2 million rows each into my model using SSDT, one table at a time.
I had no problem here; the model took 2.5 GB of memory. I then deployed the model to the server without processing it, just the metadata. I then closed SSDT, which unloaded the SSDT workspace and freed up that memory. So now I have 3 GB free and an unprocessed database on the server.