I just came back from the SQL PASS Summit 2009. About a week before that I started to use Twitter, and I tried to “tweet” during the conference as much as I could. My Twitter ID is http://twitter.com/VidasM. I decided that some of my posts could be interesting to people who do not use Twitter, so I cleaned them up a bit and posted them here.
For the last two years I attended the Microsoft BI conference, which I enjoyed very much. This year I’ll be going to Seattle to attend the PASS Community Summit 2009 (Nov 3-5th, 2009). Although the PASS Community Summit is not a BI-specific conference, I am quite surprised that there will be so many BI sessions. In my event planner I added 3-4 sessions that I would love to attend for almost every time slot. The good news is that all of the sessions will be recorded, and attendees will have a choice: watch them online for free later, or buy the DVD with the recordings.
This year, I’ll be volunteering at some of the PASS Community Summit events.
On Tuesday, November 3rd during lunch (between 11:45am and 1:00pm) there will be a “Birds of a Feather” event. That is a topic-based luncheon where a number of tables will be marked and discussions facilitated by MVPs or Microsoft employees. I will be volunteering at this event, and for my table I chose the topic “Gemini’s impact on SSAS/Data Warehouse projects“. As you know, with the release of Gemini, power Excel users will be able to load data from different sources and build very powerful pivot reports without any help from IT.
Yesterday I posted about my tests working with Gemini and bigger tables. I realized, and Chris Webb also suggested, that my method of generating new records by simply duplicating existing ones probably affected my results. So I ran more tests with different data. Read the rest of this entry »
Gemini was released about a week ago, and I have been playing with it almost every evening. I am still trying to understand what it is, how it works, what its limitations are, how to use DAX, etc. And I can say that I like what I see so far. Kasper de Jonge and Chris Webb have already posted their initial Gemini reviews. In this post I will share my experience with Gemini so far.
First of all, I want to point out that for Gemini tests all you need is Excel 2010 and the Gemini add-in file (it is only about 30MB). You do not need SQL Server 2008 R2. I am pointing this out because as I was setting up my testing environment I installed SQL Server 2008 R2 and later realized that it is not required. It is quite amazing that this 30MB add-in file contains such powerful software: the Gemini front-end and the in-memory Analysis Services server. Of course, for an enterprise-level installation you will need SQL Server 2008 R2 and SharePoint 2010, but such a setup is not part of my current tests. Read the rest of this entry »
A few years ago on my blog I posted the scripts “SSIS Package to drop/create partitions based on partition list in the SQL Server table” and “SSIS package that process all partitions/measure groups/cubes in one database“. These posts contained partial scripts that I developed for the company Insight Decision Solutions Inc. to maintain partitions in Microsoft SQL Server Analysis Services. This company sells, customizes and implements a pre-packaged data warehouse solution (using SQL Server, SSAS, SSIS, SSRS, SharePoint and Excel 2007) for life and health insurance companies. Recently the owners of Insight Decision Solutions Inc. let me post the full script for automating SSAS partition management. Here are step-by-step instructions that I adjusted and tested on the Adventure Works database. Most of the code comes from my earlier posts; here I just added information on how everything works together.
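To give a flavor of what such partition management looks like under the hood, here is a minimal XMLA Create command for a single measure-group partition. This is a sketch, not the actual script: all object IDs, the data source, and the query are illustrative values loosely based on Adventure Works and may not match the real metadata.

```xml
<!-- Hypothetical example: creates one partition bound to a year-filtered query.
     All IDs (database, cube, measure group, data source) are illustrative. -->
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <ParentObject>
    <DatabaseID>Adventure Works DW</DatabaseID>
    <CubeID>Adventure Works</CubeID>
    <MeasureGroupID>Fact Internet Sales 1</MeasureGroupID>
  </ParentObject>
  <ObjectDefinition>
    <Partition>
      <ID>Internet_Sales_2004</ID>
      <Name>Internet_Sales_2004</Name>
      <Source xsi:type="QueryBinding">
        <DataSourceID>Adventure Works DW</DataSourceID>
        <QueryDefinition>
          SELECT * FROM FactInternetSales
          WHERE OrderDateKey BETWEEN 20040101 AND 20041231
        </QueryDefinition>
      </Source>
      <StorageMode>Molap</StorageMode>
    </Partition>
  </ObjectDefinition>
</Create>
```

Dropping a partition works the same way with a Delete command that names the partition’s object path, which is why a partition list kept in a SQL Server table is enough to drive both operations.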
The new SSAS 2008 DMVs allow you to easily access Microsoft SQL Server Analysis Services (SSAS) metadata: information about cubes, dimensions, measure groups, measures, etc. Because DMV metadata is returned as a result set, it is very easy to build Reporting Services reports that generate documentation for your database.
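As a minimal example of what a DMV query looks like, the following can be run in an MDX query window in SQL Server Management Studio connected to an SSAS 2008 database; the rows returned depend on the cubes in that database.

```sql
-- Run in an MDX query window connected to an SSAS 2008 database.
-- Returns one row per cube with the time its data was last refreshed.
SELECT CUBE_NAME, LAST_DATA_UPDATE
FROM $SYSTEM.MDSCHEMA_CUBES
```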
You can execute SSAS DMV queries directly in the Analysis Services environment, but this approach has a lot of limitations; most importantly, you cannot do joins between DMVs. To work around this limitation, I created a linked server from SQL Server to Analysis Services. This way I can do joins between DMVs. Here is the script I used to create the linked server: Read the rest of this entry »
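The full script is in the complete post; a minimal sketch of the idea, with hypothetical names (the linked server SSAS_LINKED, a local SSAS instance, and the Adventure Works DW catalog), looks like this:

```sql
-- Create a linked server pointing at a local SSAS instance
-- (server name, data source and catalog are illustrative).
EXEC master.dbo.sp_addlinkedserver
    @server     = N'SSAS_LINKED',
    @srvproduct = N'',
    @provider   = N'MSOLAP',
    @datasrc    = N'localhost',
    @catalog    = N'Adventure Works DW';

-- With the linked server in place, DMV result sets can be joined in T-SQL,
-- e.g. listing every measure together with its cube:
SELECT m.MEASURE_NAME, c.CUBE_NAME
FROM OPENQUERY(SSAS_LINKED,
        'SELECT [CUBE_NAME], [MEASURE_NAME] FROM $SYSTEM.MDSCHEMA_MEASURES') AS m
JOIN OPENQUERY(SSAS_LINKED,
        'SELECT [CUBE_NAME] FROM $SYSTEM.MDSCHEMA_CUBES') AS c
    ON m.CUBE_NAME = c.CUBE_NAME;
```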
I pre-ordered “Microsoft SQL Server 2008 Analysis Services Unleashed” (Amazon links: US, UK and Canada) some time ago, and today I finally received it. The authors are Irina Gorbach, Alexander Berger and Edward Melomed. This book is the second edition of the older book “Microsoft SQL Server 2005 Analysis Services” (Amazon links: US, UK and Canada). It has the same structure as the older book, with the same nine parts, but the 41 chapters are slightly different.
I recently posted a new PowerShell script to process all dimensions and cubes in a specified Analysis Services database. This script has a few additional parameters that give you more control over your SSAS database processing. First of all, you can specify the MaxParallel parameter to control the level of parallelism. You can also specify how many processing commands to execute per batch. Read the rest of this entry »
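I have not reproduced the script itself here, but in essence what it sends to the server is an XMLA Batch with a Parallel element; the MaxParallel value and the object IDs below are illustrative, and each batch the script builds contains the number of Process commands you asked for.

```xml
<!-- Hypothetical batch: process two dimensions with at most 4 parallel jobs.
     Database and dimension IDs are illustrative (Adventure Works). -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel MaxParallel="4">
    <Process>
      <Object>
        <DatabaseID>Adventure Works DW</DatabaseID>
        <DimensionID>Dim Date</DimensionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
    <Process>
      <Object>
        <DatabaseID>Adventure Works DW</DatabaseID>
        <DimensionID>Dim Product</DimensionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
</Batch>
```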