Thursday, November 8, 2012

Extract Oracle Blob Image to Physical Files on Disk using SSIS

Are you looking for a solution to extract Oracle BLOB images to physical files on disk using SSIS?

If yes, then here is the solution.

You can use the OLE DB connector to connect to an Oracle system, but for better performance Attunity has come up with a connector plugin that is freely distributed on the Microsoft site (no license required).

The data flow has 3 steps:
1) Use the Attunity Oracle Connector as the source, connecting to the Oracle DB. It can be downloaded from the link below.

2) A Derived Column which builds the output path: base file path (e.g. c:\) + file name (we can pull it from the table or generate a new id) + file extension from the mime type (we can pull it from the source table, e.g. jpg, png etc.). A sketch of a source query that supplies these columns follows the download link.

3) An Export Column transformation as the destination, which writes each BLOB out to the path built in step 2.

Link to Download Attunity Connector
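
To make steps 2 and 3 concrete, here is a minimal sketch of the source query, assuming a hypothetical product_images table; the table and column names are placeholders, not from a real schema:

-- Hypothetical source query for the Attunity Oracle source.
-- image_blob feeds the Export Column transformation, while image_id and
-- file_extension feed the Derived Column that builds the output path,
-- e.g. "c:\" + image_id + "." + file_extension.
SELECT image_id,
       image_blob,        -- the BLOB to write out to disk
       file_extension     -- e.g. 'jpg', 'png'
FROM   product_images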


Hope it helps...

Saturday, August 11, 2012

WebAPI Message Handler

I have read an article on WebAPI message handlers which is very cool. I am not going to repeat its details here, to avoid duplicating the effort, but here is the link to it:
Click Here


Thanks to Aliostad for presenting a wonderful article.

Monday, July 30, 2012

SQL 2012 : Column Store Index

A columnstore index is one of the new features introduced by Microsoft in the SQL Server 2012 edition.
It is a new kind of index that provides significant performance gains for queries and for SSAS aggregations.
In a regular index, each row is stored on a page, whereas in a columnstore index each column is stored on its own set of pages, which makes it much more efficient for queries such as aggregations that touch only specific columns. Columnstore indexes are highly compressed, so it is easy to keep a large volume of data in memory, resulting in very low IO cost.

However, there are some limitations; a few are mentioned below, with a syntax sketch after the list:
1) They are read-only; once created, inserts/updates are not possible. They can still be used in warehousing applications where data is loaded periodically: before the data load the index can be dropped or disabled, and then created again once the load is finished.
2) There can be only one columnstore index per table.
3) It can't be a clustered index.
4) It can't be a PK or FK.
5) It can't be used on tables that have CDC or FILESTREAM.
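
Here is a minimal sketch of the syntax, assuming a hypothetical dbo.FactSales table; the disable/rebuild pair shows one way of handling limitation 1 during a periodic warehouse load:

-- Create a nonclustered columnstore index on the columns the
-- aggregation queries touch.
CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount)
GO

-- Typical load pattern: disable before the load, rebuild after it.
ALTER INDEX IX_FactSales_ColumnStore ON dbo.FactSales DISABLE
-- ... run the data load here ...
ALTER INDEX IX_FactSales_ColumnStore ON dbo.FactSales REBUILD
GO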

Reading material:

http://social.technet.microsoft.com/wiki/search/searchresults.aspx?q=SQL+Server+Columnstore
http://blog.sqlauthority.com/2011/10/29/sql-server-fundamentals-of-columnstore-index/
http://blog.sqlauthority.com/2011/11/07/sql-server-video-performance-improvement-in-columnstore-index/
http://blog.sqlauthority.com/2011/11/06/sql-server-updating-data-in-a-columnstore-index/
http://blog.sqlauthority.com/2011/10/30/sql-server-how-to-ignore-columnstore-index-usage-in-query/

SQL Merge + SCD ( slowly changing dimensions)

SQL Server MERGE is a powerful T-SQL statement that lets you do insert, update and delete in one statement instead of writing 3 different statements: one for update, one for insert and one for delete. Using MERGE is cleaner and much more efficient when it executes.

The syntax looks like this:


MERGE INTO dbo.destination AS d
USING dbo.source AS s
    ON (s.pk = d.pk)
WHEN MATCHED
    THEN UPDATE SET d.col1 = s.col1
WHEN NOT MATCHED BY TARGET
    THEN INSERT (pk, col1) VALUES (s.pk, s.col1)
WHEN NOT MATCHED BY SOURCE
    THEN DELETE;

However, MERGE can do much more than what is seen here. There is an excellent presentation by the Kimball Group about how MERGE can be used in an SCD (slowly changing dimension) scenario, sketched after the link:

http://www.kimballgroup.com/html/08dt/KU107_UsingSQL_MERGESlowlyChangingDimension.pdf
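
As a rough illustration of that pattern, below is a minimal type 2 sketch, assuming hypothetical dbo.DimCustomer and dbo.StagingCustomer tables: the inner MERGE expires changed rows and inserts brand-new business keys, and the outer INSERT reads the OUTPUT of the MERGE to add the new current version of each expired row:

INSERT INTO dbo.DimCustomer (CustomerBK, CustomerName, StartDate, EndDate, IsCurrent)
SELECT CustomerBK, CustomerName, GETDATE(), NULL, 1
FROM (
    MERGE dbo.DimCustomer AS d
    USING dbo.StagingCustomer AS s
        ON (d.CustomerBK = s.CustomerBK AND d.IsCurrent = 1)
    WHEN MATCHED AND d.CustomerName <> s.CustomerName
        -- expire the current version of a changed row
        THEN UPDATE SET d.EndDate = GETDATE(), d.IsCurrent = 0
    WHEN NOT MATCHED BY TARGET
        -- brand-new business key: insert it as the current row
        THEN INSERT (CustomerBK, CustomerName, StartDate, EndDate, IsCurrent)
             VALUES (s.CustomerBK, s.CustomerName, GETDATE(), NULL, 1)
    OUTPUT $action, s.CustomerBK, s.CustomerName
) AS changes (MergeAction, CustomerBK, CustomerName)
WHERE MergeAction = 'UPDATE';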


Thursday, July 26, 2012

Real Time Data Push to SSAS Cube

Real-time data push into an SSAS cube is very much possible.

There are different ways of doing it; however, you have to explore and see which one best fits your needs.

Below are some of the options:

1) Push the data directly from the warehouse into the SSAS cube using an SSIS package.
ex: Let's say there is a Product dimension and an Orders fact table; the SSIS package which loads the data into these tables can be extended to push the data to the Product dimension and Orders fact of the OLAP cube as well.

2) Proactive caching, driven by its policy settings. The settings SilenceInterval, SilenceOverrideInterval, ForceRebuildInterval and Latency play a very important role.

But how do we get the data-changed notifications from the data source? Here are the options:
 1) Use event trace entries.
 2) The application can notify through a web service call.
 3) Poll the data source with SQL queries to see if the data has changed, using a timestamp or any other column (a sketch follows this list).
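
For option 3, here is a minimal sketch of such a polling query, assuming a hypothetical dbo.FactOrders table with a LastUpdated column; the poller compares the returned value with the previous result and triggers cube processing when it changes:

-- Hypothetical polling query: a change in the returned value means
-- rows were inserted or modified since the last poll.
SELECT MAX(LastUpdated) AS LastChange
FROM dbo.FactOrders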


Umbraco + MVC

After the announcement from the Umbraco team that V5 was being ripped out, we were left with version 4.7, the latest version available at the time this post is published.
We were not left with many options, so we went ahead with what we had and utilised the MVCBridge package installer, which gives a way to create a page in Umbraco and pass controller and action names to the call. With this we were able to integrate our existing MVC web app into Umbraco.

We thought the issue was resolved; however, we got bogged down with Ajax calls. We then went ahead and modified the source code of MVCBridge to make Ajax calls with just a controller and action, as we don't have access to the HtmlHelper classes, and used the Umbraco /base Ajax calls with controller and action parameters, which then invoke the routing call from MVCBridge. Even though we don't have the helper class to pass parameters, we can still pass parameters to the /base Ajax call, as they will be available in the Context object for the MVC call.

Hope this helps ..

MOLAP ROLAP HOLAP

MOLAP, ROLAP and HOLAP are terms you will come across in SSAS.
Keeping it simple, what they mean is:

MOLAP is Multidimensional OLAP: whatever aggregations, group-bys etc. you can imagine a query needing are pre-aggregated and calculated beforehand in the cube, so that execution is fast and nothing has to be calculated when a request is made.

ROLAP is Relational OLAP: whenever a query is issued, it is executed against the relational database to return the result set. You can imagine this as a stored procedure call.

HOLAP is Hybrid OLAP: it is a hybrid of the above two; aggregations are stored MOLAP-style in the cube, while drill-down to detail data goes against the relational source. SharePoint deployments of reports built on SSAS cubes and SSRS reports mostly use HOLAP: MOLAP-style storage for the main reports and relational queries for the drill-down reports.


Sunday, January 8, 2012

Message[264] An attempt was made to send an email when no email session has been established

After we configure Database Mail, we can successfully send the test email.
However, when we configure alerts to send out email, they fail with the error
Message[264] An attempt was made to send an email when no email session has been established
which is found in the SQL Server Agent error logs.

Fix:

Select SQL Server Agent -> right click -> Properties -> Alert System -> check Enable mail profile
-> select Mail system as Database Mail -> select the Mail profile -> OK -> restart SQL Server Agent.
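
To first verify that Database Mail itself works from T-SQL, a quick test send looks like the sketch below; the profile name and recipient are hypothetical placeholders for your own values:

-- Test Database Mail directly from T-SQL.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'YourMailProfile',   -- placeholder profile name
    @recipients   = 'you@example.com',   -- placeholder recipient
    @subject      = 'Database Mail test',
    @body         = 'Test message from Database Mail.'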

Error: The specified @server_name does not exist

When we rename a machine, by default it just renames the machine name, whereas SQL Server still holds a reference to the old machine name. When you try to add monitoring jobs, or anything else that needs to take the machine name into consideration, it will look for the old machine name, which it is unable to access. This error is thrown at that instance.

Fix:

Run SELECT @@SERVERNAME; if it is different from the machine name,
drop the old name and add the new one:

EXEC sp_dropserver 'old machine name'
EXEC sp_addserver 'new machine name', 'local'

Then restart SQL Server.

Saturday, January 7, 2012

SQL Server Configuration Manager Error: Connection to target machine could not be made in a timely fashion

I faced this error when I was playing with database mirroring and messed up the mirroring settings.
I was not able to open SQL Server Configuration Manager.

The fix is:

Go to Start -> Run -> type services.msc -> right click Windows Management Instrumentation and restart it.



Monday, January 2, 2012

Sql Server Log Shipping

SQL Server has log shipping as one of its disaster recovery solutions. Log shipping provides an automated way of taking log backups on one server and restoring them on another server.
Log shipping in general has one primary server with the primary database, multiple secondary servers where the log will be shipped to, and a monitor server used to monitor the activities and raise alerts when something goes wrong. We can use either the primary server or a secondary server for the monitoring role; however, if something goes wrong on that server then there is a chance of losing the monitoring as well, which is why it is suggested to have a separate, dedicated monitor server.

When log shipping is configured, 4 jobs will be created across the primary, secondary and monitor servers.

1) A Backup job is created on the primary server; the purpose of this job is to take the transaction log backups.

2) A Copy job is created on the secondary server; the purpose of this job is to copy the log backups to the secondary server location.

3) A Restore job is created on the secondary server; the purpose of this job is to restore the backup log file that was copied over to the secondary server in step 2.

4) An Alert job is created on either the secondary server or the monitor server, based on where the monitoring is configured; the purpose of this job is to raise an alert if something goes wrong in the process.

The SQL Server Agent account which runs these jobs should have read/write permission to the locations where the backups are taken and copied to.

In this post I will show the implementation of log shipping using a default instance and a named instance on the same machine.

1) Create a database on the primary server.

 CREATE DATABASE [testdb] ON  PRIMARY
( NAME = N'testdb', FILENAME = N'E:\SQLDATA\testdb.mdf' , SIZE = 2048KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
 LOG ON
( NAME = N'testdb_log', FILENAME = N'G:\SQLLOG\testdb_log.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO

2) Take a full backup of the database.

 BACKUP DATABASE [testdb] TO  DISK = N'E:\SQLBACKUPS\testdb.bak' WITH NOFORMAT, NOINIT, 
 NAME = N'testdb-Full Database Backup', SKIP, NOREWIND, NOUNLOAD,  STATS = 10
GO

3) Restore the database on the secondary server in standby mode:
RESTORE DATABASE [testdb] FROM  DISK = N'E:\SQLBACKUPS\testdb.bak' WITH  FILE = 1,
  MOVE N'testdb' TO N'E:\SQLDATA\INST1\testdb.mdf',  
  MOVE N'testdb_log' TO N'G:\SQLLOG\INST1\testdb.ldf',
    STANDBY = N'E:\SQLBACKUPS\INST1\ROLLBACK_UNDO_testdb.BAK',  NOUNLOAD,  STATS = 10
GO

4) Configure the primary database: right click the database -> Properties -> Transaction Log Shipping.
Set the network path where the backups will be taken, and edit the job to schedule how frequently the backups are taken.
5) Add the secondary server instance; here we can add as many secondary servers as we want.
6) Restore Transaction Log can be set to No recovery mode or Standby mode. The advantage with Standby mode is that the database can be used in read-only mode, which is useful for reporting purposes (a sketch of the restore statement follows this list).
7) Configure the monitor server; here I am using the named instance as both the secondary and the monitor server.
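
For step 6, this is roughly the kind of statement the restore job issues in standby mode; the .trn file name here is a hypothetical example:

-- Restore a shipped log backup in standby mode; the undo file keeps
-- the database readable between restores.
RESTORE LOG [testdb]
FROM DISK = N'E:\SQLBACKUPS\INST1\testdb_20120102.trn'
WITH STANDBY = N'E:\SQLBACKUPS\INST1\ROLLBACK_UNDO_testdb.BAK', NOUNLOAD, STATS = 10
GO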

8) Now we will have 4 SQL Agent jobs: one on the primary and the rest on the secondary server.

9) That's all; log shipping is configured.

Monitoring log shipping:
Log shipping can be monitored in 3 ways:

  1) GUI: right click on the server instance -> Reports -> Transaction Log Shipping Status.
  2) System stored procedures:

      EXEC sp_help_log_shipping_monitor_primary 'satishbudati-pc', 'testdb'
      EXEC sp_help_log_shipping_monitor_secondary 'satishbudati-pc\inst1', 'testdb'

  3) msdb tables:

      SELECT * FROM msdb..log_shipping_monitor_primary
      SELECT * FROM msdb..log_shipping_monitor_secondary

Additional reading material: http://sqlcat.com/sqlcat/b/technicalnotes/archive/2009/02/24/scheduling-sub-minute-log-shipping-in-sql-server-2008.aspx