
Good news on the future of Analysis Services

Reposted from Chris Webb's blog with the author's permission.

If you read my blog, you’re probably aware of a post from last November I wrote about the future of Analysis Services while I was at the PASS Summit. It caused an awful lot of fuss and ever since I’ve not been able to go for more than 5 minutes at a SQL Server event without somebody stopping me and asking “So, Chris, is SSAS really dead?” (the answer is an emphatic NO, by the way). Anyway, as you may or may not have already seen, there’s been a new announcement on the future of Analysis Services at this year’s TechEd and an accompanying blog post from TK Anand here:
http://blogs.msdn.com/b/powerpivot/archive/2011/05/16/analysis-services-vision-amp-roadmap-update.aspx

…and as a result of my involvement in the public debate on this subject, I feel obliged to add my comments on it. I’ll do that by extracting and discussing the main points made in TK’s post, but before I start let me state clearly that this new announcement is extremely positive in my opinion and contains news that I am very glad to hear.

In the past six months, we have been talking to many people in the Microsoft BI community – customers, partners, developers, and MVPs – to get feedback on the roadmap.

The first thing I’d like to say is how much my respect has grown for the SSAS development team over the last six months, as a result of how they’ve handled the debate over the future of their product. There were some heated discussions going on in November, but once everyone had calmed down, all of the misunderstandings had been cleared up and we had apologised to each other – and I certainly had much to apologise for, as much as anyone else – there were some very fruitful discussions about how to move forward. I don’t expect Microsoft to develop products to my precise specifications, and nor should they; back in November I thought I’d made it clear that there were good reasons for going down the BISM route and that I understood why those decisions were made. However, today’s announcement does address all of the genuine (as opposed to the purely emotional) reasons I had for being upset. It is not a U-turn, more of a modification of what was presented before, but it is nonetheless significant and shows that the SSAS team have taken on board the feedback they’ve received.

The BI Semantic Model is one model for all end user experiences – reporting, analytics, scorecards, dashboards, and custom applications

So what is the new direction, then? Reading the quote above, those of you with long memories will remember the promise of the UDM back in SSAS 2005: it too was meant to be a single reporting model for all purposes. It didn’t work out that way, of course, despite the massive success of SSAS 2005; furthermore, I suspect there will be plenty of people who read TK’s blog post and are still confused about the exact story because it’s quite complicated (again, I’m reminded of the confusion around what exactly the UDM was in the 2005 release). But the overall aim of BISM in Denali is to make good on this old promise of ‘one model to rule them all’, and I think Microsoft has a much, much better chance of succeeding this time.

In purely technical terms the story is almost, but not quite, the same as it was in November. The BI Semantic Model (BISM) is the new name for Analysis Services. In Denali, when you develop with BISM you will have a choice of two types of project, representing two design experiences: use the tabular model to get all the advantages of a table-based approach, DAX queries and calculations, and Vertipaq storage; use the multidimensional model to get the rich functionality of the cubes we have today, MDX and MOLAP storage (I would urge you to read and reread TK’s post to get the full details on this). What is new is the emphasis that these are two complementary models, each with its own strengths and weaknesses, appropriate in different circumstances; it’s not a case of old versus new or deprecated versus the future, but of two sides of the same coin. What’s more, there’s now firm evidence that Microsoft will be paying more than lip service to this idea:

The multidimensional project lets model developers use the multidimensional modeling experience along with MDX and MOLAP/ROLAP (this is what existing UDM projects get upgraded to).  The tabular project lets model developers use the tabular modeling experience along with DAX and VertiPaq/DirectQuery.  It is important to note that these restrictions in the two projects are not rigid or permanent; they could very well change in future releases based on customer requirements.  For example, we could offer model developers VertiPaq as a storage option for multidimensional projects or MDX scripts for tabular projects. Another limitation in the upcoming CTP release is that models built using the multidimensional project will not support DAX queries (and thereby Crescent, which uses DAX to retrieve data from the model). We recognize that removing this restriction is very important for customers with existing Analysis Services solutions to be able to upgrade to SQL Server “Denali” and leverage Crescent. The product team is actively working on it and is committed to making this functionality available to customers.
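
To make the difference between the two design experiences a little more concrete, here is a minimal sketch of what querying each model might look like. I’m assuming the Adventure Works sample databases, and the cube, table, column and measure names below are purely illustrative: the point is simply that the multidimensional model is queried with MDX over cubes and dimensions, while the tabular model is queried with DAX over tables and columns (which is also why Crescent, as a DAX client, needs DAX support before it can sit on top of a multidimensional model).

-- MDX against a multidimensional model: Sales Amount broken down by calendar year
SELECT
  {[Measures].[Sales Amount]} ON COLUMNS,
  [Date].[Calendar Year].MEMBERS ON ROWS
FROM [Adventure Works]

-- DAX against a tabular model: the same question, phrased over tables and columns
EVALUATE
SUMMARIZE(
  'Internet Sales',
  'Date'[Calendar Year],
  "Sales Amount", SUM('Internet Sales'[Sales Amount])
)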

That quoted section is the important part of the post, and it addresses my two main complaints from last year. The first is that the multidimensional way of modelling should not be jettisoned, because it has inherent advantages for certain types of BI application, such as financial applications with complex calculations. Let me be clear: I’m not talking about the underlying technology here (I couldn’t care less if MOLAP storage disappeared tomorrow and was replaced by Vertipaq if it meant my queries ran faster), but about the basic concept of modelling data as cubes, dimensions and hierarchies instead of as a series of tables and relationships. The tabular way of modelling data has many advantages of its own, of course, and in the long run I think that for something like 30% of BI projects the tabular model will be the correct choice, for 10% the multidimensional model will be the correct choice, and for the remaining 60% either will work just as well. The point is, though, that we will have a choice between two different but equally valuable ways of modelling data, and that now Microsoft has committed to delivering new functionality in the multidimensional model in the future, choosing to use it today on a project no longer looks like a bet on a defunct technology.
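
And to give a flavour of why I say the multidimensional model has inherent advantages for things like complex financial calculations, here is one more illustrative sketch, again using made-up Adventure Works-style names: a scoped assignment in an MDX script. Calculations like this, which overwrite a whole slice of cube space in a single statement, are natural in the multidimensional model and much harder to express purely as DAX measures over tables.

-- MDX script assignment: set the 2011 quota, across every other dimension,
-- to 110% of the 2010 actuals in one scoped statement
SCOPE([Measures].[Sales Amount Quota], [Date].[Calendar Year].&[2011]);
    THIS = ([Measures].[Sales Amount], [Date].[Calendar Year].&[2010]) * 1.1;
END SCOPE;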

My second complaint was that there was no good migration roadmap for existing customers who have invested heavily in SSAS over the last few years, beyond telling them to redevelop their applications from scratch in the tabular model. As I said in the previous paragraph, it’s now clear that there will be no need to redevelop using the tabular model in the long term if there’s no desire to do so because, even if there’s not much new for existing multidimensional customers in the Denali release, there will be new functionality coming at some point as the technologies underpinning the two models come closer together. For example, now that we know we will one day have DAX support for the multidimensional model, we know that people will be able to use Crescent on top of their existing cubes just by upgrading them, without completely redeveloping them. The cool new stuff will not be restricted to users of the tabular model.

Overall, then, I’m very happy with what’s been said. I’ve been working with Analysis Services since the very beginning, when it was OLAP Services, and I’m now getting ready to learn BISM and prepare for the future – a future that looks a lot brighter to me now. Bring on the next CTP of Denali!


Chris Webb

Chris has been working with Microsoft BI tools since he started using beta 3 of OLAP Services back in the late 90s. Since then he has worked with Analysis Services in a number of roles (including three years spent with Microsoft Consulting Services) and he is now an independent consultant specialising in complex MDX, Analysis Services cube design and Analysis Services query performance problems. His company website can be found at http://www.crossjoin.co.uk and his blog can be found at http://cwebbbi.wordpress.com .



 
