{"id":258,"date":"2013-10-26T13:55:18","date_gmt":"2013-10-26T18:55:18","guid":{"rendered":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/?p=258"},"modified":"2013-10-25T18:32:25","modified_gmt":"2013-10-25T23:32:25","slug":"powerpivot-tips-and-tricks-for-it-pros","status":"publish","type":"post","link":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/258_powerpivot-tips-and-tricks-for-it-pros","title":{"rendered":"PowerPivot Tips and Tricks for IT Pros"},"content":{"rendered":"<p>On Oct 26, 2013 I presented at the &#8220;Big Data Saturday&#8221; event in Toronto, Canada. Here I am posting my session key points and notes.<\/p>\n<h2>Power Pivot Database<\/h2>\n<ul>\n<li>Make a copy of your workbook and change the extension from .xlsx to .zip. Inside the zip file, in the folder &#8220;xl\\customData&#8221;, there is a file item1.data that is a backup of the Power Pivot database.<\/li>\n<li>When the Power Pivot engine starts, it restores the database backup on your machine in the folder %temp%. Power Pivot will create a sub-folder &#8220;Vertipaq_&lt;GUID&gt;&#8221;, where in the .db sub-folder you will find a folder for each table in your model. Tip &#8211; go into the biggest table folder, order the files by size and see if you really need the columns that use most of the space.<\/li>\n<\/ul>\n<p><!--more--><\/p>\n<h2>Power Pivot Application performance tuning<\/h2>\n<p>For data to be updated in Pivot Tables\/Charts you have to consider 3 load stages:<\/p>\n<ol>\n<li>Loading of the Power Pivot model into memory. The bigger the model (amount of data and number of tables\/attributes), the longer it takes. There is no direct performance tuning here.<\/li>\n<li>Processing of the Power Pivot model &#8211; that is, loading data from the source into the Power Pivot model. Most important tip &#8211; do not load columns that you really do not need. 
Also, calculated columns are materialized in this step and might affect processing performance.<\/li>\n<li>Querying of the Power Pivot model &#8211; that is, refreshing Pivot Table\/Chart results with queries against the Power Pivot database. Most performance tuning time will be spent on writing better DAX calculated measures.<\/li>\n<\/ol>\n<p>Tip: you should know how long each step takes for your sample set of data.<\/p>\n<p>In Excel 2013 the status bar messages are:<\/p>\n<ul>\n<li>&#8220;Loading Data Model&#8230;&#8221; &#8211; loading the Power Pivot model into memory.<\/li>\n<li>&#8220;Retrieving Data&#8230;&#8221; &#8211; processing data (loading from the source into the Power Pivot model).<\/li>\n<li>&#8220;Reading Data&#8230;&#8221; &#8211; querying data from the Power Pivot model and updating Pivot Tables\/Charts.<\/li>\n<\/ul>\n<p>Processing and Querying performance can be tuned using a Power Pivot trace.<\/p>\n<p>For Processing performance, apply a filter for the &#8220;Progress Reports \/ Progress Report End&#8221; EventClass and then add a filter on the Duration column (for example, &gt;= 35ms). The most important EventSubclass values are:<\/p>\n<blockquote><p>1 \u2013 Process (Partition, Table, MG, Calc Column, DB)<br \/>\n17 \u2013 ReadData<br \/>\n43 \u2013 Analyze\\Encode Data<br \/>\n54 \u2013 Hierarchy Processing<\/p><\/blockquote>\n<p>For Querying performance tuning, start by filtering for the &#8220;Query End&#8221; Event and apply an additional Duration filter. This way you will be able to identify slow Pivot Tables\/Charts. Then it is best to use an SSAS Tabular instance to continue tuning. Another option is using &#8220;DAX Studio&#8221;.<\/p>\n<p>Tip: enabling tracing can affect Power Pivot performance if your DAX formulas are more complicated. 
You might see queries running longer and the trace file growing to tens or hundreds of megabytes.<\/p>\n<h2>Power Pivot Dynamic Management Views (DMVs)<\/h2>\n<p>To run a DMV query in Excel 2010, use the DMV trick:<\/p>\n<blockquote><p>Double click inside the Pivot (drill-through), right mouse click on the new table, menu item \u201cTable\u201d, then select \u201cEdit Query\u2026\u201d and enter the DMV query<\/p><\/blockquote>\n<p>To run a DMV query in Excel 2013, use the DAX Query Table trick:<\/p>\n<blockquote><p>Select a cell outside the pivot, menu \u201cData\u201d, \u201cExisting Connections\u201d, \u201cTables\u201d, select any table, then \u201cOpen\u201d in a new worksheet, then right mouse click, \u201cTable\u201d &gt; \u201cEdit DAX\u201d, change the command type to \u201cDAX\u201d and enter the DMV query<\/p><\/blockquote>\n<p>My most used Power Pivot DMV queries are:<\/p>\n<blockquote><p>&#8212;&#8212; List of available DMV tables<br \/>\nSELECT TABLE_NAME<br \/>\nFROM $SYSTEM.DBSCHEMA_TABLES<br \/>\nWHERE TABLE_SCHEMA = &#8216;$System&#8217;<br \/>\nORDER BY 1<\/p>\n<p>&#8212;&#8212; List calculated measures<br \/>\nSELECT CUBE_NAME AS PerspectiveName<br \/>\n, MEASUREGROUP_NAME AS TableName<br \/>\n, MEASURE_NAME AS MeasureName<br \/>\n, [DESCRIPTION] AS MeasureDescription<br \/>\n, EXPRESSION AS MeasureExpression<br \/>\n, MEASURE_IS_VISIBLE AS IsVisible<br \/>\n, DEFAULT_FORMAT_STRING AS FormatString<br \/>\nFROM $SYSTEM.MDSCHEMA_MEASURES<br \/>\nWHERE MEASURE_AGGREGATOR = 0 &#8212;&#8212; Include only calculated measures<\/p>\n<p>&#8212;&#8212; Tables violating Referential Integrity<br \/>\nSELECT *<br \/>\nFROM $SYSTEM.DISCOVER_STORAGE_TABLES<br \/>\nWHERE RIVIOLATION_COUNT &gt; 
0<\/p>\n<p>&#8212;&#8212; Table record counts<br \/>\nSELECT DIMENSION_NAME AS TableName<br \/>\n, TABLE_ID<br \/>\n, ROWS_COUNT AS RecCount<br \/>\nFROM $SYSTEM.DISCOVER_STORAGE_TABLES<br \/>\nORDER BY ROWS_COUNT DESC<\/p>\n<p>&#8212;&#8212; Table column memory used (by segment); needs aggregation<br \/>\nSELECT DIMENSION_NAME AS TableName<br \/>\n, COLUMN_ID AS ColumnID<br \/>\n, SEGMENT_NUMBER AS SegmentNo<br \/>\n, USED_SIZE \/ 1024 AS SizeKB<br \/>\nFROM $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS<br \/>\nORDER BY USED_SIZE DESC<\/p>\n<p>&#8212;&#8212; Memory used (just hash-encoded columns, not value-encoded columns)<br \/>\nSELECT DIMENSION_NAME AS TableName<br \/>\n, ATTRIBUTE_NAME AS ColumnName<br \/>\n, DataType<br \/>\n, DICTIONARY_SIZE \/ 1024.0 AS SizeKB<br \/>\nFROM $system.DISCOVER_STORAGE_TABLE_COLUMNS<br \/>\nWHERE DICTIONARY_SIZE &gt; 0<\/p><\/blockquote>\n<h2>Power Pivot Tools<\/h2>\n<ul>\n<li>Workbook Size Optimizer: <a href=\"http:\/\/tinyurl.com\/PPSizeOptimizer\">http:\/\/tinyurl.com\/PPSizeOptimizer<\/a><\/li>\n<li>DAX Studio: <a href=\"http:\/\/daxstudio.codeplex.com\">http:\/\/daxstudio.codeplex.com<\/a><\/li>\n<\/ul>\n<p>DAX Studio can be used to run DAX queries against your Power Pivot database. The provided clear-cache functionality (menu item) does not work. 
Instead, use an XMLA query:<\/p>\n<blockquote><p>&lt;ClearCache xmlns=&quot;http:\/\/schemas.microsoft.com\/analysisservices\/2003\/engine&quot;&gt;<br \/>\n&lt;Object&gt;<br \/>\n&lt;DatabaseID&gt;E53BEF70384B40109FC4&lt;\/DatabaseID&gt;<br \/>\n&lt;\/Object&gt;<br \/>\n&lt;\/ClearCache&gt;<\/p><\/blockquote>\n<p>To get the DatabaseID, go into the %temp%\\Vertipaq_&lt;GUID&gt; folder and find the sub-folder with extension &#8220;.db&#8221;. The file name part is the DatabaseID: &lt;DatabaseID&gt;.db.<\/p>\n<h2>Other<\/h2>\n<ul>\n<li>Power Pivot Application example: <a href=\"http:\/\/PortfolioSlicer.com\">http:\/\/PortfolioSlicer.com<\/a><\/li>\n<li>Must-read whitepaper: &#8220;<a href=\"http:\/\/msdn.microsoft.com\/en-us\/library\/dn393915.aspx\">Performance Tuning of Tabular Models in SQL Server 2012 Analysis Services<\/a>&#8221;<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>On Oct 26, 2013 I presented at the &#8220;Big Data Saturday&#8221; event in Toronto, Canada. Here I am posting my session key points and 
notes.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[20],"tags":[],"aioseo_notices":[],"_links":{"self":[{"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/posts\/258"}],"collection":[{"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/comments?post=258"}],"version-history":[{"count":10,"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/posts\/258\/revisions"}],"predecessor-version":[{"id":269,"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/posts\/258\/revisions\/269"}],"wp:attachment":[{"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/media?parent=258"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/categories?post=258"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.ssas-info.com\/VidasMatelisBlog\/wp-json\/wp\/v2\/tags?post=258"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}