Just got this link in my inbox and thought the BI community might be interested in this demo of the new PivotViewer Silverlight control (especially if you live in Germany, Spain, the Netherlands or Uruguay – the semi-finalists of this year’s FIFA World Cup):
Even if you are not a soccer fan, you will surely still find it very interesting.
The official site where you can download the control is here: http://www.silverlight.net/learn/pivotviewer/
In the rare cases when we use dynamic SQL and want to reference a database name in our code, we are better off avoiding hard-coding it. Unfortunately, I could not find an easy way to access a connection manager’s database name, and on my current project the catalog name is not in the SSIS XML configuration file. Therefore, I had to resort to a little trick to pull the database name out and pass it to a stored procedure. In brief, we can do the following:
1. Create a user variable database_name
2. Create an Execute SQL Task using the connection manager we want to get the database name from, which does:
SELECT db_name() AS database_name
3. Map the Single Row result set to our database_name variable
4. Place this Execute SQL Task before any components which use the variable
5. Pass the variable to our dynamic SQL stored procedure
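On the stored procedure side, the database name can then be spliced into the dynamic SQL. Here is a minimal sketch of what that might look like – the procedure and table names are made up for illustration:

```sql
-- Hypothetical procedure receiving the database name from the SSIS variable
CREATE PROCEDURE dbo.usp_CountRows
    @database_name sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);

    -- QUOTENAME guards against injection through the database name
    SET @sql = N'SELECT COUNT(*) AS row_count FROM '
             + QUOTENAME(@database_name) + N'.dbo.SomeTable;';

    EXEC sp_executesql @sql;
END
```

In the package, the Execute SQL Task calling the procedure simply maps the database_name variable to the @database_name parameter.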
There we go – the database name now follows the connection manager configured in the SSIS package configurations, which is a bit better than hard-coding it.
OK, my doubts aside, this is my first post about PowerPivot. Obviously the demand is high and the prospects are bright – so here we go…
There are a number of posts about using PowerPivot as a datasource in PerformancePoint and Reporting Services. You can find two links below (both at TechNet) and I will not repeat the content there:
There you can find a detailed step-by-step approach to using PowerPivot workbooks published to SharePoint in your reports. One thing which is not explained, though, is that you can also use any published DAX measures as report data. They appear as physical measures in the PowerPivot Analysis Services instance and can be used directly in reports. I found this fascinating, as now we can integrate even user-defined logic into our reports.
For example, today I was working on a small demo of PowerPivot, and I created a PowerPivot report integrating a relational data source and an OLAP data source from two completely different systems, which I then related to each other by State/Province. After that, I created a DAX ratio measure, which showed the relationship between the Sales measures from the two databases. After publishing the workbook to SharePoint, I opened Report Builder 3.0 and, to my surprise (for some reason I expected to see only physical measures), I was able to pull in the DAX measure which I had just created and beautify the map of the USA based on its value.
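The ratio measure itself was a one-liner along these lines – the table and column names here are invented for illustration, since a PowerPivot measure is just a DAX formula entered in the measure grid:

```dax
-- Relational sales relative to OLAP sales,
-- returning BLANK() instead of a divide-by-zero error
=IF ( SUM ( OlapSales[Amount] ) = 0,
      BLANK (),
      SUM ( RelationalSales[Amount] ) / SUM ( OlapSales[Amount] ) )
```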
The integration between the various components in the Microsoft BI stack is continually improving. With the latest versions of those components (labelled 2010 or R2), we are being empowered with richer and more powerful tools covering a wider array of users – proving that the toolset is the best out there for both enterprise-level and relatively smaller customers.
Today I found out about a cool ETL framework being developed by my colleague in Sydney – James Beresford, a.k.a. BI Monkey. He also writes a blog, which I have read in the past.
Check out his take on what an ETL framework should look like. He told me that the current version (v1) is suitable for smaller projects and that a new version (v2) is coming out soon – so if you like v1, definitely stay tuned for v2! And in case you have suggestions – he is pretty cool to chat to and very keen to take the framework further, so don’t hesitate to contact him!
Recently I had the (dis)pleasure of working with Business Analysts who also thought that they were good at dimensional modelling. So, I had to implement BI solutions (including cubes) on top of their database design. I will show an example (about 95% the same as the actual design) of why letting BAs stray into dev territory does not yield the best results:
This “dimensional model” was created by an experienced BA. Some of its “features”:
1. The fact table had EffectiveFrom and EffectiveTo dates
2. The relationships between some Dim tables were 1:1?!
3. The Time dim (the only one properly implemented on its own – at the bottom of my example) had columns like DateTimeName nvarchar(100), DateTimeKey nvarchar(100), YearName nvarchar(100), etc.
4. The tables at the top had nothing to do with the rest (in fact, a colleague of mine reckons they are there to fill in the white space at the top of the A3 printout)
Another design, which is better but still pretty bad, showed up after my training on Dimensional Modelling (1 hour to go through EVERYTHING, including M2M relationships, Parent-Child hierarchies, Type 2 dimensions, etc.):
Obviously, the designer (a developer, actually) did grasp some concepts. However, my explanation of a star schema must not have been clear enough…
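For contrast, a minimal star schema looks something like this (the table and column names here are illustrative): a single fact table with foreign keys to the dimension tables, and no relationships between the dimensions themselves:

```sql
CREATE TABLE dbo.DimDate (
    DateKey int PRIMARY KEY,
    [Date]  date,
    [Year]  int
);

CREATE TABLE dbo.DimProduct (
    ProductKey  int PRIMARY KEY,
    ProductName nvarchar(50)
);

-- The fact table sits at the centre; dimensions relate only to it
CREATE TABLE dbo.FactSales (
    DateKey     int NOT NULL REFERENCES dbo.DimDate (DateKey),
    ProductKey  int NOT NULL REFERENCES dbo.DimProduct (ProductKey),
    SalesAmount money NOT NULL
);
```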
Hope you had some fun with these two diagrams… and I am sure many developers end up in a similar situation, especially when someone else designs their databases. But two points:
1. Ask the BAs to analyse the business and their requirements – not to design the database
2. 1 hour of training on dimensional modelling will not make you an expert