Tuesday, June 23, 2009

SQL Server Tools : Qure Product Review

Today I received an email from Ami Levin, a SQL Server MVP with more than a decade of experience working with SQL Server. He introduced me to a product called "Qure".

I went through Qure's product information page and found that it focuses on four areas of SQL Server performance optimization:


1) Smart Indexing - Create, Drop and Modify Indexes for Best Overall Performance
Details can be found on the product page.

My View: The product page says that Qure has a patented algorithm, which they call Smart Indexing, that suggests code changes and places where indexes should be created, dropped, or modified. I would like to review and compare how much better this is than the results of the Index Tuning Wizard or Database Engine Tuning Advisor. If it is indeed better, maybe Microsoft should review theirs.
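To get a baseline for such a comparison, the optimizer's own missing-index suggestions can be pulled straight from the DMVs. Below is a minimal sketch; the weighting expression is an illustrative heuristic of mine, not Qure's patented algorithm:

-- Minimal sketch: missing-index suggestions recorded by the optimizer.
-- The "weight" expression is an illustrative heuristic only.
SELECT TOP 10
    mid.statement AS table_name,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_user_impact,
    migs.user_seeks * migs.avg_total_user_cost * migs.avg_user_impact AS weight
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
    ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
    ON migs.group_handle = mig.index_group_handle
ORDER BY weight DESC;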

2) Query Syntax Optimization
Details can be found on the product page.

My View: This is a feature I really like, provided it results in better performance. Generally it's not feasible to change a lot of code in production, as many approval cycles are involved. If I am asked to approve modifying 100 objects just for syntax, I won't approve it. But if, for example, changing 35-50 odd objects can reap at least a 5% performance gain, I would be more than happy to see it happen.

This can be achieved with a kind of workaround in DBPro, but it won't tell you the performance gain.
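To illustrate the kind of syntax-only rewrite I have in mind (table and column names are hypothetical), making a predicate sargable lets the optimizer seek an index instead of scanning:

-- Non-sargable: the function wrapped around the column forces a scan.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE YEAR(OrderDate) = 2009;

-- Sargable rewrite: same result, but an index on OrderDate can be seeked.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '20090101' AND OrderDate < '20100101';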

3) Schema Optimization
Details can be found on the product page.

My View: This feature is like combining the data profiling capabilities of SSIS with the Database Engine Tuning Advisor. It really makes me curious to explore how effective this feature is. If it works to my expectations, it could revolutionize database modeling.
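As a small example of what I mean by profiling-driven schema review (object names are hypothetical), checking how much of a column's declared width is actually used can justify narrowing its type:

-- Hypothetical profiling query: is that NVARCHAR(4000) column really needed?
SELECT
    MAX(LEN(CustomerName))        AS max_chars_used,
    MAX(DATALENGTH(CustomerName)) AS max_bytes_used,
    COUNT(*)                      AS total_rows
FROM dbo.Customers;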

4) Additional Optimizations
Details can be found on the product page.

My View: I would like to compare the performance of this feature with that of the Best Practices Analyzer tool.

By and large, this seems like a big thing in a small package. The pricing, relative to the features offered, seems high compared to Quest Software products. Allow me some time to evaluate it against other tools, and I will be back with results shortly.

Friday, June 19, 2009

SSIS Installation Acceptance Testing

Installation of systems like ERP / SAP generally requires clients to sign off on acceptance testing documents, but this is generally not required for a typical SSIS installation. Recently I came across a scenario where I was required to provide documentation and deliverables for acceptance testing of an SSIS installation at a client location.

Thankfully, Microsoft ships SSIS with a dozen samples. Two of the samples that can be used as a base for such acceptance testing are "AWDataWarehouseRefresh" and "Data Cleaning".

Execution of "AWDataWareHouseRefersh" sample tests components like Sequence Container, For Loop Container, For Each Loop Container, Execute SQL Task, Bulk Insert Task, Data Flow Task and connection adapters as well.

Execution of "Data Cleaning" sample tests components like Lookup, Fuzzy Lookup, Derived Column, Union All, Conditional Split, Data Flow Task and connection adapters as well.

I feel this is enough for a basic smoke test of a proper SSIS installation.
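If the sample packages are configured to log to the SQL Server log provider (which on SQL Server 2005 writes to dbo.sysdtslog90 in the configured database), the log table itself can serve as acceptance-test evidence:

-- Assumes SSIS logging via the SQL Server log provider (SQL Server 2005).
SELECT event, source, starttime, endtime, message
FROM dbo.sysdtslog90
WHERE event IN ('PackageStart', 'PackageEnd', 'OnError')
ORDER BY starttime;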

Tuesday, June 16, 2009

Dimensional Modeling Cube Structures using SSAS: 10 Rules

  • Load detailed atomic data into dimensional structures
  • Structure dimensional models around business processes
  • Ensure that every fact table has an associated date dimension table
  • Ensure that all facts in a single fact table are at the same grain or level of detail
  • Resolve many-to-many relationships in fact tables
  • Resolve many-to-one relationships in dimension tables
  • Store report labels and filter domain values in dimension tables
  • Make certain that dimension tables use a surrogate key
  • Create conformed dimensions to integrate data across the enterprise
  • Continuously balance requirements and realities to deliver a DW/BI solution that's accepted by business users and that supports their decision-making

Reference: Kimball University
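As a minimal DDL sketch of a few of these rules in practice (a surrogate-keyed dimension, a date dimension, and a single-grain fact table; all names and columns are illustrative):

CREATE TABLE dbo.DimDate (
    DateKey      int      NOT NULL PRIMARY KEY,  -- e.g. 20090623
    FullDate     datetime NOT NULL,
    CalendarYear smallint NOT NULL
);

CREATE TABLE dbo.DimProduct (
    ProductKey  int IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- surrogate key
    ProductCode varchar(20)  NOT NULL,                   -- natural/business key
    ProductName varchar(100) NOT NULL                    -- report label lives here
);

CREATE TABLE dbo.FactSales (
    DateKey     int   NOT NULL REFERENCES dbo.DimDate (DateKey),
    ProductKey  int   NOT NULL REFERENCES dbo.DimProduct (ProductKey),
    SalesAmount money NOT NULL  -- every fact at the same grain: one row per sale line
);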

Tuesday, June 09, 2009

Performance Tuning SQL Server Data Types : SQL Server Interview Questions

Selecting inappropriate data types, especially on large tables with millions or billions of rows, can have significant performance implications. Recently I found an article that discusses data type performance in decent depth. I believe this article would be very beneficial for beginners who are new to SQL Server 2005 / 2008. It can also be a very handy reference for SQL Server interview questions on data types.

Reference: PASS Performance SIG
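As a quick illustration of why type width matters: narrower types mean more rows per 8 KB page, and hence fewer pages to read. For instance:

-- Each expression stores the value 1, at very different storage costs:
SELECT
    DATALENGTH(CAST(1 AS bigint))   AS bigint_bytes,    -- 8
    DATALENGTH(CAST(1 AS int))      AS int_bytes,       -- 4
    DATALENGTH(CAST(1 AS smallint)) AS smallint_bytes,  -- 2
    DATALENGTH(CAST(1 AS tinyint))  AS tinyint_bytes;   -- 1

On a billion-row table, choosing int over bigint saves roughly 4 GB in that single column, before any indexes on it are counted.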

Friday, June 05, 2009

Performance Counters Analysis : SQL Server, IIS, MOSS, BizTalk, Exchange, and Active Directory

Logs are one of the fundamental resources where data profiling and collection start for performance tuning. Reading logs, which can exist in a variety of formats ranging from text files to Windows Event Logs, and then aggregating the information and making sense of it is not an easy task. It becomes even more tedious if your job is to keep a constant watch on a production environment through logs.

In the case of SQL Server, there are two tools that can be of use if you are dealing heavily with logs: the Log Parser tool and the Performance Analysis of Logs (PAL) tool. Below are the descriptions of the tools from their respective project sites:

Log Parser


Log parser is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows® operating system such as the Event Log, the Registry, the file system, and Active Directory®. You tell Log Parser what information you need and how you want it processed. The results of your query can be custom-formatted in text based output, or they can be persisted to more specialty targets like SQL, SYSLOG, or a chart.
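For instance, a Log Parser query over the System event log uses familiar SQL syntax; the query below is passed to LogParser.exe on the command line with -i:EVT as the input format (treat the exact field names as a hedged example from my memory of the tool):

-- Run as: LogParser.exe "<query>" -i:EVT
-- Top 10 event sources by error count in the System event log.
SELECT TOP 10 SourceName, COUNT(*) AS Total
FROM System
WHERE EventTypeName = 'Error event'
GROUP BY SourceName
ORDER BY Total DESC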

PAL Tool


Ever have a performance problem, but don't know what performance counters to collect or how to analyze them? The PAL (Performance Analysis of Logs) tool is a new and powerful tool that reads in a performance monitor counter log (any known format) and analyzes it using complex, but known thresholds (provided). The tool generates an HTML based report which graphically charts important performance counters and throws alerts when thresholds are exceeded. The thresholds are originally based on thresholds defined by the Microsoft product teams and members of Microsoft support, but continue to be expanded by this ongoing project. This tool is not a replacement of traditional performance analysis, but it automates the analysis of performance counter logs enough to save you time. This is a VBScript and requires Microsoft LogParser (free download).

PAL Tool Features
  • Thresholds files for most of the major Microsoft products such as IIS, MOSS, SQL Server, BizTalk, Exchange, and Active Directory.
  • An easy-to-use GUI interface which makes creating batch files for the PAL.vbs script easy.
  • A GUI editor for creating or editing your own threshold files.
  • Creates an HTML based report for ease of copy/pasting into other applications.
  • Analyzes performance counter logs for thresholds using thresholds that change their criteria based on the computer's role or hardware specs.

Thursday, June 04, 2009

SQL Server Standards, SQL Server Naming Conventions, SQL Server Coding Guidelines


Generally, once a project kicks off, a guidelines and standards document is treated as the de facto reference for the project (in order to save it from getting jinxed by ignorant knowledge workers :) ). If you are looking for a standards and guidelines document covering any of the topics below, here is the SQL Server Standards Document for download.

SQL Server Naming Conventions and Standards

  • Databases, Files, and File Paths
  • Tables and Views
  • Columns
  • Indexes
  • Stored Procedures
  • Triggers
  • Variables

SQL Server Programming Guidelines

  • Introduction
  • Code Readability and Format
  • Datatypes
  • Stored Procedures
  • Performance Considerations
  • Miscellaneous Topics

SQL Server Security Model

  • General Access Requirements
  • SQL Server Roles

SQL Server Migration

  • RACF Requirements
  • Development Environment
  • Production Migration

SQL Server Execution Environment

  • Non-Web Applications
  • Web Applications

Wednesday, June 03, 2009

ETL Architecture : Metadata driven extract architecture / BI application architecture

For a requirement involving real-time (or near-real-time) movement of data from an operational logistics application to an analytics/reporting environment, where presentation tools such as Microsoft Reporting Services, Business Objects, or QlikView consume the data, there are two approaches: Operational Analysis and Traditional Business Intelligence.

Operational Analysis is based on loading the source data into compressed proprietary data stores without transformations (cleansing, normalization, hierarchies). This provides analytical capabilities covering the metric relationships and hierarchies that already exist in the data model being sourced.

The limitations here are no trending, no history, and no complex business rules or metrics that require corporate reference data such as product or organization hierarchies. A benefit on the extraction side is that this approach does not require any staging area.

Traditional Business Intelligence is based on a metadata-driven approach where the source data is transformed to properly analyze a specific set of business metrics and their associated business process hierarchies, including trending and history.

A limitation is that this approach requires a more complex extraction and loading process, as well as a staging area.

A major benefit is that this approach "insulates" the reporting or analytical layer from any changes or additions to the source. This is accomplished by using a data-driven approach and creating a business-dimension-oriented semantic layer. In most cases the metrics and dimensions match the business processes and do not change over time, whereas the source data and nomenclature are volatile.

Summary:

A best-practice extraction architecture requires a metadata-driven application architecture that is aware of structure or content (domain) changes in the source data and automatically adjusts load processes accordingly.


Reference: Article by Ira Warren Whiteside on SQLServerCentral
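A minimal sketch of what "metadata driven" can mean at the table level, assuming a hypothetical control table that the load process reads instead of hard-coding its sources:

-- Hypothetical control table: the extract process iterates over these rows,
-- so adding a source becomes an INSERT here rather than a code change.
CREATE TABLE dbo.ExtractControl (
    SourceTable     sysname  NOT NULL,
    TargetTable     sysname  NOT NULL,
    WatermarkColumn sysname  NOT NULL,  -- e.g. a last-modified-date column
    LastWatermark   datetime NOT NULL,
    IsEnabled       bit      NOT NULL DEFAULT (1)
);

-- The loader asks the metadata what to move next:
SELECT SourceTable, TargetTable, WatermarkColumn, LastWatermark
FROM dbo.ExtractControl
WHERE IsEnabled = 1;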

Tuesday, June 02, 2009

SQL Server Performance Auditing, Profiling, Monitoring, Logging, Troubleshooting Tools


There are a handful of well-known monitoring tools from Microsoft, including Profiler, Sysmon/Perfmon, and the Database Engine Tuning Advisor / Index Tuning Wizard.

Activity Monitor provides information about SQL Server processes and how these processes affect the current instance of SQL Server. It is similar to Windows Task Manager: it graphically displays the percentage of processor time, the number of waiting tasks, database I/O (megabytes per second), and batch requests (number per second), as well as active user tasks, resource waits, data file I/O, and recent expensive queries.

Using SQL Server Management Studio (SSMS), you can monitor the current activity of all defined jobs on an instance of SQL Server through the SQL Server Agent Job Activity Monitor.

You can monitor a mirrored database during a mirroring session to verify whether and how well data is flowing. To set up and manage monitoring for one or more of the mirrored databases on a server instance, you can use either Database Mirroring Monitor or the sp_dbmmonitor system stored procedures.

Replication Monitor provides details on the status of SQL Server replication and allows you to configure replication alerts.

SQL Server Profiler tracks engine process events, such as the start of a batch or a transaction, enabling you to monitor server and database activity (for example, deadlocks, fatal errors, or login activity). You can capture SQL Server Profiler data to a SQL Server table or a file for later analysis, and you can also replay the events captured on SQL Server step by step, to see exactly what happened.

Monitoring the logs is another way to get a lead on activity in SQL Server; the information in these event logs allows you to view informational, auditing, warning, and error messages that can help you troubleshoot SQL Server problems.

In addition to the SQL Server error log, the Windows application/system/security event logs provide an overall picture of events occurring on Windows Server and the Windows operating system as a whole, as well as events in SQL Server, SQL Server Agent, and Full-Text Search. They contain information about SQL Server events that is not available elsewhere, which you can use to troubleshoot SQL Server-related problems.

The information in SQL Server Agent Logs allows you to view informational, auditing, warning, and error messages that can help you troubleshoot SQL Server Agent problems.
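Both logs can also be read from T-SQL via xp_readerrorlog; since it is undocumented, treat the parameters below as subject to change between versions:

-- Arg 1 = log number (0 = current); arg 2 = 1 for the SQL Server log, 2 for the Agent log.
EXEC master.dbo.xp_readerrorlog 0, 1;   -- current SQL Server error log
EXEC master.dbo.xp_readerrorlog 0, 2;   -- current SQL Server Agent log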

The best way to get server state is through Dynamic Management Views (DMVs) and Dynamic Management Functions (DMFs), which return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. There are two types: server-scoped dynamic management views and functions, which require VIEW SERVER STATE permission on the server, and database-scoped ones, which require VIEW DATABASE STATE permission on the database. All dynamic management views and functions exist in the sys schema and follow the naming convention dm_*, so you must prefix the name of the view or function with the sys schema when you use it.
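For example, two server-scoped DMV queries that come up constantly in monitoring:

-- What has the instance been waiting on? (cumulative since the last restart)
SELECT TOP 10 wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;

-- What is running right now, and is anything blocked?
SELECT session_id, status, command, wait_type, blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50;  -- skip most system sessions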

Database Console Commands (DBCC) are statements commonly used by every DBA, and often by users, whenever a problem is reported. They are grouped into four categories: Maintenance, Informational, Validation, and Miscellaneous.
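One example from each of the first three categories, as a quick reference:

DBCC UPDATEUSAGE (0);       -- Maintenance: correct page and row count inaccuracies
DBCC SQLPERF (LOGSPACE);    -- Informational: log size and space used per database
DBCC CHECKDB (N'master');   -- Validation: check logical and physical integrity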

For many administrative and informational purposes there are stored procedures that reside in the SQL Server system database 'master'. A few that are commonly used in day-to-day monitoring are sp_helpdb, sp_helpserver, sp_spaceused, and sp_who or sp_who2 (undocumented). There are many more system and database engine related stored procedures used for the general maintenance of a SQL Server instance.
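Typical day-to-day usage looks like this (the table name passed to sp_spaceused is illustrative):

EXEC sp_helpdb;                   -- all databases, their sizes and options
EXEC sp_spaceused N'dbo.Orders';  -- space used by a given table
EXEC sp_who2;                     -- undocumented: current sessions and blocking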

Reference: The above article is a summarized excerpt of the original article on SSQA.net.
