Ragged Dimensions and Excel 2007

If you have ragged dimensions, i.e. hierarchies in which the paths from the top node to the leaf nodes differ in length, and you are using Excel 2007 as a front end, then… bad news! It will not work.
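
To make the term concrete: the (made-up) parent-child table below defines a ragged organization dimension, because Legal is a leaf directly under Head Office while the sales teams sit one level deeper.

-- Illustrative only: a parent-child dimension with leaves at different depths.
CREATE TABLE dbo.Organization
(
    OrgKey       int          NOT NULL PRIMARY KEY,
    ParentOrgKey int          NULL REFERENCES dbo.Organization (OrgKey),
    OrgName      nvarchar(50) NOT NULL
);

INSERT INTO dbo.Organization (OrgKey, ParentOrgKey, OrgName)
SELECT 1, NULL, N'Head Office'     UNION ALL
SELECT 2, 1,    N'Legal'           UNION ALL -- leaf at depth 2
SELECT 3, 1,    N'Sales'           UNION ALL
SELECT 4, 3,    N'Sales Team East' UNION ALL -- leaf at depth 3
SELECT 5, 3,    N'Sales Team West';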

Read more »

Synchronize a SQL Server Database with an SVN Repository

Yes, it is possible to automatically synchronize a SQL Server database with an SVN repository. Every change to a table, stored procedure or other database object is automatically committed to the repository: new objects are added, deleted objects are removed, and so on. As a bonus, why not monitor the contents of (some) database tables as well? You probably have a few tables with parameter values that influence the applied business logic; it may be useful to put their contents under source control too!
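
The mechanics are described in the full post; the core idea is simply that everything you want to version must be available as text. As a simplified illustration (column aliases are made up), a query like the one below returns the definitions of all programmable objects, which a sync script could write out as one .sql file per object before committing the working copy to SVN.

-- Simplified sketch: list all programmable objects with their definitions,
-- so a sync script can write one .sql file per object and commit it to SVN.
SELECT  s.name      AS SchemaName,
        o.name      AS ObjectName,
        o.type_desc AS ObjectType,
        m.definition AS ObjectDefinition
FROM    sys.sql_modules AS m
        JOIN sys.objects AS o ON o.object_id = m.object_id
        JOIN sys.schemas AS s ON s.schema_id = o.schema_id
ORDER BY s.name, o.name;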

Read more »

A Stored Procedure to Launch a SQL Agent Job and Wait

If you need to launch a SQL Agent job from within a stored procedure you can simply use msdb..sp_start_job @job. But what if you need to wait for that job to finish before you can continue with the rest of the stored procedure? Here is the code for a stored procedure that does exactly that: it launches a SQL Agent job and waits for it to finish. The name of the job and the maximum time to wait are its input parameters.
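
The complete procedure is in the full post; a minimal sketch of the polling approach (the procedure name and details below are illustrative) could look like this:

-- Sketch only: start a SQL Agent job and poll msdb until it finishes
-- or until the maximum wait time has passed.
CREATE PROCEDURE dbo.usp_StartJobAndWait
    @JobName        sysname,
    @MaxWaitSeconds int
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @JobId uniqueidentifier;
    SELECT @JobId = job_id FROM msdb.dbo.sysjobs WHERE name = @JobName;

    EXEC msdb.dbo.sp_start_job @job_name = @JobName;

    DECLARE @Waited int;
    SET @Waited = 0;

    WHILE @Waited < @MaxWaitSeconds
    BEGIN
        WAITFOR DELAY '00:00:05';
        SET @Waited = @Waited + 5;

        -- The job is still running as long as there is an activity row with a
        -- start time but no stop time. (A production version should also filter
        -- on the most recent SQL Agent session.)
        IF NOT EXISTS (SELECT 1
                       FROM   msdb.dbo.sysjobactivity
                       WHERE  job_id = @JobId
                         AND  start_execution_date IS NOT NULL
                         AND  stop_execution_date IS NULL)
            RETURN 0;
    END;

    RAISERROR('Job did not finish within the allowed wait time.', 16, 1);
END;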

Read more »

Get an Average and a Sum in a Reporting Services Subtotal

You would think that something as simple as a subtotal line showing both a sum and an average would be easy to configure in Reporting Services. Alas… Clicking the Subtotal button magically adds a row to the matrix component, but you cannot tell it which aggregate function to use: it always shows the sum of the numbers above it. And only one such row can be added.

[Screenshot: the subtotal row that the Subtotal button adds to the matrix]

Frustrating, if you want to achieve something simple like this:

[Screenshot: a subtotal line showing both the sum and the average]

You will need to apply a few tricks to achieve the above result.

Read more »

Deploying a Microsoft BI Solution via Script

At the end of a phase of a business intelligence project, when the work has been accepted and it is time to move into production, comes the deployment on the production environment. The deployment itself should be testable, reproducible and, of course, completely debugged. The method largely depends on how the customer's IT team works: some will insist on InstallShield executables, some will prefer promoting a server from acceptance to production, some will restore backups, and some will work script-based. I prefer that last approach and go into some detail on how to do this for all the typical parts of an MS BI solution: the database, the Integration Services packages, the SQL Agent jobs and the SSAS cubes.
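
As a taste of what such a script can look like, the SQL Agent part of the deployment can be written entirely in T-SQL with the msdb stored procedures. This is only a hedged sketch; the job, step and package names below are placeholders, not the actual solution from the post.

-- Sketch only: create a SQL Agent job from a deployment script.
USE msdb;
GO
EXEC dbo.sp_add_job
     @job_name = N'BI Nightly Load';

EXEC dbo.sp_add_jobstep
     @job_name  = N'BI Nightly Load',
     @step_name = N'Run master SSIS package',
     @subsystem = N'SSIS',
     @command   = N'/SQL "\Warehouse\MasterLoad" /SERVER "(local)"';

EXEC dbo.sp_add_jobserver
     @job_name    = N'BI Nightly Load',
     @server_name = N'(local)';
GO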

Read more »

Drillthrough with Multiple Selected Values in Excel 2007

Excel 2007 is a great tool for power users and analysts to access an SSAS cube. They can slice 'n dice the data as never before and even make appealing reports on the fly. Yet the new drillthrough capability is not completely bug free. First of all, it is impossible to drill through if the user has selected multiple values in a filter hierarchy or on a row or column axis. Okay, that is a restriction in the current state of SSAS 2005. (I'll explain how to get around it via MDX in another post.) What is worse is that in certain situations it does allow a drillthrough but then displays wrong results.

Read more »

Dimension Security and Drillthrough with SSAS 2005

We are facing a problem when applying simple dimension security: it disables drillthrough.

The business case is simple enough: there are two user groups, one that is denied access to a specific dimension and one that is allowed to see its contents. (In theory we would like to completely hide that dimension from the first group, but that is impossible – vote for this issue here.)

To implement this we have set the Allowed Member Set to {} for all attributes of the particular dimension. It works as expected.

But… the drillthrough stops working for the users who have no access to the dimension, even though the drillthrough result does not contain any elements from that dimension. There are no exotic calculated measures involved, no cell security, …

Read more »

SQLStatementSource via an Expression: Be Careful

Often when using parameters in an Execute SQL Task, the thing just does not understand me. Or at least not what I am trying to do with the parameters. It seems the SQL Native Client is even more dumb when it comes to passing parameters than the MS OLE DB Provider for SQL Server, though I have not done a thorough investigation to identify the dumber of the two.
A workaround is to put the SQL statement as an expression in the SQLStatementSource property. But be careful… this has already cost me hours of unnecessary debugging. With a simple trick this waste of time can easily be avoided.

Read more »

One-click Deploy SSIS Package from Visual Studio to SQL Server

Visual Studio can deploy Analysis Services and Reporting Services projects to a server, but there is no such option for Integration Services. Yet in my current setup I need to run packages on the server: multiple packages maintained by different people are part of a larger chain, so executing only one package makes no sense in this case. Side note: another reason that forces me to run packages on the server is that anything with "TransactionOption = Required" fails to run from a client PC. If somebody knows of a way to make it run, please leave a comment…

Luckily there is an easy way to add the deployment capability to VS. In short: a custom action calls a batch command file that in turn uses dtutil to actually deploy the SSIS package.

Read more »

Too Many Copy & Paste Operations May Confuse SSIS

Apparently it's easy to confuse SSIS if you perform too many copy and paste operations inside a Data Flow Task. In this case I added six data flows, each going through its own Slowly Changing Dimension Transformation, to one Data Flow Task. I know, I could have split it into six separate Data Flow Tasks, but the data flows are very small. Each flow needs tracking information from an SSIS variable, which is added with a Derived Column Transformation. Since each flow receives the same tracking variable, the Derived Column Transformation was configured once and then copied five times. The Data Flow Task runs fine but ends with an error, even though all components inside it are marked green! The task itself is marked red in the Control Flow editor.

The error thrown was:

The variable “System::LocaleID” is already on the read list.

The workaround:
- create a new package
- copy the misbehaving Data Flow Task and paste it into the new package
- delete the Data Flow Task from the original package
- copy the Data Flow Task from the new package and paste it back into the original package

Maybe it has something to do with component IDs that are not getting updated correctly?
Related: New suggestion for templates in Katmai
