About me

I work as a SQL Server DBA / Developer and BI Consultant in Toronto, Canada.



SSAS database processing sometimes gets stuck

April 22nd, 2007 by Vidas Matelis

Last week I had time to test whether SP2 fixed the SSAS database processing problem. On databases that have a lot of partitions, the processing task sometimes gets “stuck”: there is no CPU activity and processing never completes. Unfortunately, I have to report that SP2 did not fix this problem; for me the processing routine failed on the first attempt. The good news is that Microsoft's workaround for this problem on SP1 also works on SP2: change the SSAS server parameter ThreadPool\Process\MaxThreads from its default value of 64 to 150. I have tested my processing routines with the value 150 and so far everything has worked as expected.
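For reference, this setting lives under ThreadPool\Process\MaxThreads in the server's advanced properties, which are also stored in msmdsrv.ini. A sketch of the relevant msmdsrv.ini fragment, which you should verify against your own server's file before editing (only the MaxThreads value changes):

```xml
<ConfigurationSettings>
  <ThreadPool>
    <Process>
      <!-- default is 64; raised to 150 as the workaround -->
      <MaxThreads>150</MaxThreads>
    </Process>
  </ThreadPool>
</ConfigurationSettings>
```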


I found a blog entry from the PSS SQL Server Engineers that describes this problem and how to solve it:

Note 2008-Nov-07:

I am finding this issue on the majority of servers I deal with, but the right MaxThreads number can differ between machines depending on the configuration. From my experience, the value 150 is OK for servers with up to 8 CPU cores, but when a server has 16 CPU cores I had to use the value 200. In any case, I recommend that you follow the advice from the PSS SQL Engineers: sum the values of the Threads:Processing Pool Idle Threads, Threads:Processing Pool Busy Threads, and Threads:Processing Pool Job Queue Length performance counters, then add 10. You might need to do that a few times, but eventually you will come up with a number that is good for your specific machine. In my experience, after setting the new MaxThreads value I had to restart the Analysis Services server; it did not wake up by itself as described in their article.
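The counter-summing advice above amounts to a tiny calculation. A sketch, where the counter readings are hypothetical samples you would take in Performance Monitor while processing is running:

```python
def recommended_max_threads(idle_threads, busy_threads, job_queue_length):
    """Sum the three Threads:Processing Pool performance counters and
    add 10, following the PSS SQL Engineers guidance described above."""
    return idle_threads + busy_threads + job_queue_length + 10

# Hypothetical counter readings from a busy 16-core server:
print(recommended_max_threads(64, 126, 0))  # prints 200
```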

Posted in SSAS | 8 Comments »

8 Responses

  1. Vidas Matelis Says:

    I found a blog entry from the PSS SQL Server Engineers that describes this problem and how to solve it:

  2. LeandroTubia Says:

    Hi Vidas!
    We’re facing the same problem on most of our x64 boxes with 16 or more CPUs and 40 GB of RAM. We are processing dozens of small partitions within the same transaction.
    When MaxParallel is not specified, SSAS becomes very optimistic and the job scheduler decides to process about 40 partitions in parallel.
    Given that each partition consumes three threads, and that SSAS uses up to the CoordinatorBuildMaxThreads value (4) of threads for multiple segments within a partition, that results in 3 * 40 * 4 = 480 threads used.
    So we partially solved it by setting a MaxParallel value of 4 in the XMLA processing commands.
    That gave us a well-balanced trade-off between processing time and resources consumed.
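    A minimal sketch of an XMLA processing command using this approach; the Batch/Parallel structure is standard, but the object IDs here are hypothetical placeholders:

    ```xml
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <!-- MaxParallel caps how many objects SSAS processes concurrently -->
      <Parallel MaxParallel="4">
        <Process>
          <Object>
            <DatabaseID>MyDatabase</DatabaseID>  <!-- hypothetical IDs -->
            <CubeID>MyCube</CubeID>
            <MeasureGroupID>MyMeasureGroup</MeasureGroupID>
            <PartitionID>Partition_2007_01</PartitionID>
          </Object>
          <Type>ProcessFull</Type>
        </Process>
      </Parallel>
    </Batch>
    ```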

  3. Vidas Matelis Says:


    I also played with MaxParallel, and in just a few days I am planning to publish a PowerShell script to process partitions with the MaxParallel setting. I also found that MaxParallel works, but that it is a bit harder to get the MaxParallel value right. When I do not use MaxParallel, my server's CPU utilization during processing is 100% and I get the fastest cube processing. Limiting the MaxParallel value usually increases processing time, and my CPU utilization goes below 100%.
    Did you measure how much slower your cube processing is with MaxParallel?

  4. LeandroTubia Says:

    Hi Vidas,

    I agree that finding the exact MaxParallel value is difficult and depends on each implementation.
    I use resource-usage thresholds to find it; in particular, I keep an eye on memory usage and the processing queues, as those are the most common bottlenecks.
    When not using MaxParallel, we saw that the processing threads queue length is more likely to rise to 30 or more, causing processing jobs to wait for free threads.
    One thing that I have not mentioned is that most of our cubes have few or even no aggregations.
    I believe that CPU is used more intensively while building indexes and aggregations, not while reading incoming data, so I guess you must have a great number of aggregation jobs executing at the same time.
    Does it happen during a full process?

  5. Vidas Matelis Says:


    Most of the measure groups in production have usage-based aggregations. As you said, building aggregations and indexes on larger partitions uses a lot of CPU. Also, for business reasons, we have some measure groups that are not straight star schemas and require relational joins (and extra CPU). We notice most of the issues during a full re-process, which can take many hours.

  6. LeandroTubia Says:

    Hi Vidas,

    If partitions are quite large and the MaxParallel option is used, then increasing the CoordinatorBuildMaxThreads parameter to a value of 10 or more could allow SSAS to read more partition segments in parallel during index and aggregation creation.

    Another workaround could be to separate data processing from index and aggregation building.
    You could try both approaches.

    Our processing times did not suffer when using MaxParallel because our cubes had to be heavily partitioned, which allowed SSAS to schedule processing jobs more efficiently. I'm talking about 300 million monthly records.
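    The two-step approach mentioned above (data first, indexes and aggregations afterwards) can be sketched as an XMLA batch; commands outside a Parallel element run sequentially, and MyDatabase is a hypothetical placeholder ID:

    ```xml
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <!-- Step 1: read source data only; no indexes or aggregations built -->
      <Process>
        <Object>
          <DatabaseID>MyDatabase</DatabaseID>
        </Object>
        <Type>ProcessData</Type>
      </Process>
      <!-- Step 2: build indexes and aggregations from the data loaded above -->
      <Process>
        <Object>
          <DatabaseID>MyDatabase</DatabaseID>
        </Object>
        <Type>ProcessIndexes</Type>
      </Process>
    </Batch>
    ```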

  7. Vidas Matelis Says:


    Thanks for sharing your experience and the tip on CoordinatorBuildMaxThreads. My understanding is that increasing this value will use more memory. I did not explain that I have clients where SQL Server and SSAS run on the same machine (for various reasons). I assume that you have the proper configuration, with your SQL Server database on a different machine than SSAS; that is probably why we see different CPU usage patterns.

    I’ll keep in mind that the behaviour differs between configurations, and I’ll try changing the CoordinatorBuildMaxThreads value at some point in the future.

  8. Analysis Services AMO -Powershell script to process SSAS database | Vidas Matelis Analysis Services Blog Says:

    […] my post “SSAS database processing sometimes gets stuck” comments Leandro Tubia posted that reducing the MaxParallel value helps with the “SSAS […]