Sunday, September 25, 2011

Data change alerts in SSRS Denali and PerformancePoint Services 2010

Time is money, and this philosophy turns into a requirement when it comes to reporting. RSS changed the way people access information: one no longer needs to continuously poll the source of information to check for updates.

Reporting environments may start with a modest set of reports, but over time the ecosystem can grow to a huge number of them. In a BI environment, reports may range from dashboards showing high-level trends and KPIs to basic operational reports filled with numeric data. When information has changed and what information has changed are the two basic updates any user would like to know, so that they check the reports only then. Pushing reports into users' mailboxes at regular intervals is an inefficient practice that unnecessarily increases the volume of data in an enterprise and creates a lot of duplication. And if users have to open reports regularly just to check whether any data has changed, that too puts unnecessary load on server resources.

PerformancePoint dashboards and SSRS reports are the two pillars of reporting in the MS BI stack of technologies.

1) One of the upcoming enhancements in SSRS Denali is Data Alerts in a SharePoint integrated mode installation of SSRS, facilitated by SQL Server Agent jobs that keep polling your data for changes based on the rules defined in the alerts. Users get a notification, with clear indications of what has changed, only when the data has actually changed, and only then do they need to bother looking at the reports.
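The polling model behind such alerts can be sketched in a few lines. This is a hypothetical illustration of the general technique, not the actual SSRS implementation; the rule structure and `fetch_value` callable are assumptions for the sketch.

```python
# Hypothetical sketch of a polling-based data alert: on each poll, the
# latest value is fetched, and a notification fires only when the value
# has changed since the last poll AND the alert rule matches it.

def check_alert(rule, fetch_value, last_value):
    """Return (notify, new_value); notify is True when an alert should fire."""
    current = fetch_value()
    notify = current != last_value and rule(current)
    return notify, current

# Example rule (an assumption for this sketch): alert when sales drop below 1000.
rule = lambda total_sales: total_sales < 1000

# Simulated data source whose value changed since the last poll.
notify, new_value = check_alert(rule, fetch_value=lambda: 900, last_value=1200)
print(notify, new_value)  # True 900 - the value changed and the rule matches
```

A scheduler (in SSRS's case, a scheduled job) would simply call this check at the configured interval and send the notification when `notify` is true.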

2) Bidoma Alert is a productivity add-on for PerformancePoint 2007 and PerformancePoint Services 2010 from Bidoma.com. It provides alerting capabilities and reports on the different constituents of PerformancePoint dashboards, essentially scorecards and KPI data changes. More about this product can be read on Bidoma.com.

Tuesday, September 13, 2011

SSRS Denali enhancements in Sharepoint integrated mode

I apologize to my blog readers for having been very passive in blogging these days; I am just recovering from some personal issues and getting back on track. I promise to be back to my normal pace of blogging within a week's time.

The advancements in SSRS Denali, as heard and seen at TechEd, seem to be bridging gaps that ideally should have been filled in the R2 release itself. But it's better late than never!

1) The first gap being filled is SSRS SharePoint integration. Though integrated mode has been supported, implementing it has required a cross-IT-team effort. Also, it's a known fact that reports deployed in SharePoint integrated mode have been found to perform slower than in native mode. In my understanding, SSRS Denali brings SSRS in as a shared service in SharePoint, so it would effectively gain all the advantages of being a SharePoint shared service.

On the other side, the alarming part is that the endpoints that used to work for SharePoint integrated deployments might not work in the same manner with SSRS Denali. This can be a major migration blow for applications that access reports programmatically from application servers using these endpoints.

2) SSRS logging has been largely limited to the ExecutionLog3 view, with the rest of the help coming from tools like Fiddler for troubleshooting. In integrated mode, logging also seems to have been considerably improved.

3) Data Alerts is one of the new enhancements in SSRS Denali, and can be thought of as a SQL scheduled-job implementation that sits in the SharePoint DB to watch for changes in data. This sounds very good, but it looks a little risky given the way it can continuously trouble database servers to check for alerts, as I have not heard how much control is available over the frequency of alerts.

4) Crescent is another flavor of self-service reporting; it requires Silverlight and works in a browser within SharePoint only. The presentation of data looks fancy, and even controls similar to the motion framework for trend analysis are being introduced in this tool.

5) SSRS Denali also brings better export options like ZIP formats, support for Office 2007-based export formats, better compression, better performance in SharePoint integrated mode, and more.

From a higher level, most of these are welcome changes. But from an architecture standpoint, it would be interesting to see whether SharePoint integrated mode SSRS deployment changes the architecture in a big way, and whether it puts a dead end to seamless migration from R2.

Wednesday, September 07, 2011

Intelligent Distribution Analysis using Analyzer

Business intelligence reporting tools should be capable of analyzing massive amounts of quantitative data via categorization into distributed frequencies and groups. The intelligence expected here is facilitating analysis of data density distribution across various logically-related or unrelated groups, identifying outliers, quality control, identifying non-linear relationships between different business parameters, and beyond. This corresponds to a branch of statistical analysis known as distribution analysis.

For example, business assignments that are tracked using various project management metrics have cost performance index (CPI) and schedule performance index (SPI) as two of the tracked metrics. A large organization could have several hundred projects running simultaneously. To analyze how the business is performing on cost against schedule, data of all projects is categorized into clusters based on CPI versus SPI, and analysis can be done on clusters to derive the relationship between these parameters as well as performance of projects under different clusters.
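The CPI-versus-SPI categorization described above can be sketched as a simple quadrant clustering. The thresholds (1.0 on each index) and the sample project figures below are illustrative assumptions, not data from any real portfolio.

```python
# Illustrative sketch: bucket projects into quadrants by CPI vs SPI.
# CPI >= 1 means on or under budget; SPI >= 1 means on or ahead of schedule.

def cluster(cpi, spi):
    """Assign a project to one of four CPI/SPI quadrants."""
    cost = "on-budget" if cpi >= 1.0 else "over-budget"
    schedule = "on-schedule" if spi >= 1.0 else "behind-schedule"
    return f"{cost}/{schedule}"

# Made-up sample portfolio: project name -> (CPI, SPI).
projects = {"A": (1.1, 0.9), "B": (0.8, 1.2), "C": (1.05, 1.0)}
clusters = {name: cluster(cpi, spi) for name, (cpi, spi) in projects.items()}
print(clusters)
# {'A': 'on-budget/behind-schedule', 'B': 'over-budget/on-schedule',
#  'C': 'on-budget/on-schedule'}
```

With projects bucketed this way, the analysis of cost performance against schedule performance reduces to comparing the populations of the four clusters.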

Any intelligent distribution analysis starts with the study of the higher-level composition. In this Analyzer recipe, we will take a look at how a capable reporting tool can help with interesting and intelligent distribution analysis and extract insights.



The above screenshot is a typical example of time-series analysis using the Adventureworks cube, showing the sales performance of different geographies. The next logical step would be to study the distribution across geographies in a particular year, and for this the above graph is not suitable. The first step in studying distribution is to merge all individual values into a single entity; in the graph above, the different bars in any particular year stand as individual entities. A simple way to merge them into a single entity is a stacked bar chart. A stacked column chart would have vertical bars, and if you look carefully, the vertical screen space available is much less than the horizontal, so a stacked bar chart makes better use of the screen. With just a simple selection you can change the graph type for the same data, and the visualization would look like the screenshot below. In case you wish to study only selected geographies, you always have the option to filter the legend and categories.



Pie charts are widely used for distribution analysis. As we have a time series involved here, we need a pie chart for every year. This is as easy as selecting the pie chart graph, but each pie shows the distribution within a specific year, so it should not be compared across the entire time series. For example, in the screenshot below, you should not compare the weightage of the United States across years, as the distribution is shown for a specific year. You might feel that CY 2001 has the highest US weightage of all the years, but this is not correct: CY 2003 has the highest value, which you can see when you hover over any slice of the chart. Cross-series distribution analysis cannot be done this way, but you can clearly derive that the US had the highest weightage compared to the others in every year, which is not as clearly visible in a column chart. Keep in mind while doing distribution analysis that composition should be studied within the same entity and not across entities.
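The point about within-year composition can be made concrete with a small calculation. The sales figures below are made up for illustration, not Adventureworks data.

```python
# Each pie shows shares *within* one year, so a country's slice size
# depends on that year's total, not on its absolute sales.
sales = {
    "CY 2001": {"United States": 500, "Canada": 100, "France": 400},
    "CY 2003": {"United States": 900, "Canada": 600, "France": 500},
}

def shares(year_sales):
    """Convert absolute sales for one year into within-year fractions."""
    total = sum(year_sales.values())
    return {country: round(value / total, 2) for country, value in year_sales.items()}

for year, data in sales.items():
    print(year, shares(data))
# The US slice looks bigger in CY 2001 (share 0.50 vs 0.45) even though
# its absolute sales are higher in CY 2003 (900 vs 500).
```

This is exactly the trap described above: a larger slice means a larger share of that year's total, not larger absolute sales.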



Senior business management typically uses pyramids to study the composition of different entities, for example a resource pyramid for a particular project. With the selection of a pyramid chart, you can easily achieve the same visualization too. You might wonder why all countries are listed in the same order in all the pyramids, when the performance of each country varies from year to year. The reason is that the Country attribute-hierarchy in the Geography user-hierarchy of the Adventureworks cube is sorted by name. Since Analyzer uses AMO behind the scenes, it retrieves data in that same order. This works to the benefit of the user: if the sort order of the hierarchy were based on reseller sales amount, the order of the pyramid would change accordingly, which is very much desirable as the flexibility of configuration is left to the discretion of the user. Analyzer also has built-in capabilities to define your own MDX queries without very detailed knowledge of MDX.
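The effect of the hierarchy's sort key on the pyramid ordering can be shown with a toy sort. The country figures are invented, and both orderings are simply illustrations of sorting by name versus sorting by measure value.

```python
# Toy illustration: the pyramid's level order follows the hierarchy's sort key.
sales = {"Australia": 300, "Canada": 150, "United States": 900}

by_name = sorted(sales)                                # hierarchy sorted by name
by_value = sorted(sales, key=sales.get, reverse=True)  # sorted by sales amount

print(by_name)   # ['Australia', 'Canada', 'United States']
print(by_value)  # ['United States', 'Australia', 'Canada']
```

With a name-sorted hierarchy every year's pyramid lists the countries identically (the first ordering); only a value-sorted hierarchy would reorder them per year (the second).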



Intelligence in any form of analysis sits in the brain of the individual analyzing the data. With Analyzer, one can leave a reasonable share of the onus of representing data intelligently on the tool itself, analyzing data through intelligent, interactive visualizations suited to different forms of analysis. The above examples of distribution analysis cover only higher-level data, but there are more chart options for quantitative distribution analysis, such as scatter charts. You can find out about more such interesting options on the Analyzer website.
