Wednesday, December 27, 2006

Don't start the user defined stored procedure with "SP_"

As you might already know, system stored procedures are prefixed with "sp_". If we prefix our user-defined stored procedures with "sp_" it can bring down performance, because SQL Server always resolves a stored procedure beginning with "sp_" in the following order:

1) Master DB,
2) The stored procedure based on the fully qualified name provided,
3) The stored procedure using dbo as the owner, if one is not specified.

So, when you have an SP with the prefix "sp_" in a DB other than master, the Master DB is always checked first, and if the user-created SP has the same name as a system stored procedure, the user-created stored procedure will never be executed.

For example, let's say that by mistake you have named one of your user-defined stored procedures "sp_help" within the database!

Create proc sp_help
as
Select * from dbo.empdetails


Now when you try executing the stored procedure using the below script, you will realize that it has actually called the sp_help system stored procedure of the Master DB and not your SP.

Exec sp_help
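
If you rename the procedure with a prefix other than "sp_" (the name "usp_EmpDetails" below is just a hypothetical example), SQL Server resolves it within the current database and your procedure is the one that runs:

Create proc usp_EmpDetails
as
Select * from dbo.empdetails
Go

Exec usp_EmpDetails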

Hope this helps!!


Monday, December 25, 2006

sp_executesql( ) vs Execute() -- Dynamic Queries

There were a few questions regarding "Passing table names as parameters to stored procedures" in the Dotnetspider forums. I don't feel this is the right way of coding. Still, since many people keep asking similar questions in the forums, I thought I would write a post on "SP_EXECUTESQL()" vs "EXECUTE()".

Sample SP to pass table name as parameter:

Create proc SampleSp_UsingDynamicQueries
@table sysname
As

Declare @strQuery nvarchar(4000)
Select @strQuery = 'select * from dbo.' + quotename(@table)

exec sp_executesql @strQuery -------- (A)
--exec (@strQuery) ---------------------- (B)
go

Test: Execute dbo.samplesp_usingdynamicqueries 'EmpDetails'

In the above stored procedure, irrespective of whether we use the line marked (A) or the line marked (B), it would give us the same result. So what's the difference between them?

One basic difference is that while using (A) we need to declare @strQuery as nvarchar or nchar.

(i) It would throw the below error if we declare the query variable as the varchar data type.
Msg 214, Level 16, State 2, Procedure sp_executesql, Line 1
Procedure expects parameter '@statement' of type 'ntext/nchar/nvarchar'.

(ii) It would throw the below error if we declare the query variable as nvarchar(5000).
Msg 2717, Level 16, State 2, Procedure SampleSp_UsingDynamicQueries, Line 1
The size (5000) given to the parameter '@query' exceeds the maximum allowed (4000).


Hope the first point is well made!

Let's move to the next point: the EXEC statement is unparameterised whereas sp_executesql is parameterised. What does that mean?

1. EXECUTE() :: Say we write a query which takes a parameter, let's say "EmpID". When we run the query with "EmpID" as 1 and then as 2, it creates two different cache entries (one each for the values 1 and 2 respectively).

It means that for unparameterised queries the cached plan is reused only if we ask for the same id again. So the cached plan is not of any major use.

2. SP_EXECUTESQL() :: In the same situation, for "parameterised" queries the cached plan is created only once and is reused 'n' number of times, similar to a stored procedure. So this gives better performance.

Let me create a sample to illustrate this:

Create table dbo.EmpDetails
(
EmpID int,
EmpName varchar(30)
)
Go

Insert into dbo.EmpDetails values (1, 'Vadivel')
Insert into dbo.EmpDetails values (2, 'Sailakshmi')
Insert into dbo.EmpDetails values (3, 'Velias')
Go

Create table dbo.EmpTimeSheet
(
EmpID int,
Day varchar(10),
HrsPut float
)
Go

Insert into dbo.EmpTimeSheet values (1, 'Mon',7.5)
Insert into dbo.EmpTimeSheet values (1, 'Tue',2)
Insert into dbo.EmpTimeSheet values (1, 'Wed',8)
Insert into dbo.EmpTimeSheet values (2, 'Mon',9)
Insert into dbo.EmpTimeSheet values (2, 'Tue',8.3)
Insert into dbo.EmpTimeSheet values (2, 'Wed',11)
Go

Time to test it out:

DBCC Freeproccache
Go

Declare @strQuery nvarchar(1000)

Select @strQuery = 'Select E.EmpName, TS.Day, TS.HrsPut
from dbo.empdetails E, dbo.EmpTimeSheet TS
where E.EmpID = TS.EmpID and TS.EmpID = N''1'''
Exec (@strQuery)

Select @strQuery = 'Select E.EmpName, TS.Day, TS.HrsPut
from dbo.empdetails E, dbo.EmpTimeSheet TS
where E.EmpID = TS.EmpID and TS.EmpID = N''2'''
Exec (@strQuery)

Select @strQuery = 'Select E.EmpName, TS.Day, TS.HrsPut
from dbo.empdetails E, dbo.EmpTimeSheet TS
where E.EmpID = TS.EmpID and TS.EmpID = @EmpID'
Exec sp_executesql @strQuery, N'@EmpID int', 1
Exec sp_executesql @strQuery, N'@EmpID int', 2

After this, let's have a look at the cached plans by executing the below query. The first two (unparameterised) have an execution_count of 1; the last one (parameterised) has an execution_count of 2.

Select sqlTxt.text, qStats.execution_count from sys.dm_exec_query_stats qStats
Cross Apply (Select [text] from sys.dm_exec_sql_text(qStats.sql_handle)) as sqlTxt option (Recompile)

DBCC FREEPROCCACHE has been used just to flush out all the already cached plans and make it easier to see the plans for our queries alone :)

Extract from MSDN: Use DBCC FREEPROCCACHE to clear the procedure cache. Freeing the procedure cache causes, for example, an ad hoc SQL statement to be recompiled instead of reused from the cache.

Please note: Run this statement only on a development server and never on a production box :)

If you want to test this on a SQL Server 2000 box, query the system table "syscacheobjects" instead.

Select cacheobjtype, usecounts, sql from syscacheobjects
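
If the cache on that box already has a lot of entries, you may want to narrow the output down to the statements from this test. A minimal sketch, assuming the cached statement text still contains the table name:

Select cacheobjtype, usecounts, sql
from syscacheobjects
where sql like '%EmpTimeSheet%'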

Hope this helps!

Related Article: The curse and blessings of Dynamic SQL


Friday, December 22, 2006

I am now a Microsoft Certified Technology Specialist!

Let me start by saying, I had never taken a Microsoft exam before. I had a "Microsoft Exams 100% Discount Coupon" which was valid till 31-Dec-2006, so I thought of utilizing it.

I prepared well for the 70-431 Microsoft SQL Server 2005 - Implementation and Maintenance paper, and this morning at 10.45 AM I took the online test @ NIIT Adyar. I am glad that I cleared it with a good score of 982 (out of 1000). Now I am a "Microsoft Certified Technology Specialist" :)

There were 52 questions, out of which approximately 15 were simulation questions (that was where I spent most of my time). Though the score might look big, I need to accept that I answered 20 to 30% of the questions without much confidence. Hope the choices I made were accidentally correct :)

I was a bit tense in the morning till the time I answered the first question. The reason being, a couple of my friends knew that I was going to take the test today. I was afraid of what they would think if I failed the exam :) I am happy I have proved to myself that one can pass an MS exam even without dumps :)


Sunday, December 17, 2006

Banks : Do Not Disturb Me

As per RBI regulation, I guess all banks should have a "Do Not Disturb Me" or "Do not call me" :) web page in order to value customers' privacy.

I get at least a couple of telemarketing calls a day from one bank or the other. It's really irritating and I was looking for a way to avoid them totally. Only then did I come to know about these "Do not call registration pages" for various banks. Looooong Live RBI :) :)

1. ICICI Bank - As per the site, it seems it will take 15 days for the number to be removed from the telemarketing list!

2. Citibank -- I didn't find the time frame on this site.

3. Deutsche Bank -- As per the site, it seems it will take 30 days for the number to be removed from the telemarketing list!

4. Standard Chartered Bank -- As per the site, it seems it will take 30 days for the number to be removed from the telemarketing list!

5. HDFC Bank -- As per the site, it seems it will take 45 days for the number to be removed from the telemarketing list!

6. ABN Amro Bank -- I didn't find the time frame on this site.

Hope this is useful for many of you guys too :) By the way, if you find any such pages for Hutch / Airtel please let me know. Those guys are also really irritating :(

Wednesday, December 13, 2006

Rolling back a truncate operation!

First, I suggest you go through my earlier article on the subject "Delete VS Truncate" here.

Truncate is also a logged operation, but in a different way. It logs the deallocation of the data pages in which the data exists.

Let us create a dummy table for understanding this example:

Create Table dbo.TruncateTblDemo (Sno int)
Go


Insert few records into it:

Insert into dbo.TruncateTblDemo values (1)
Insert into dbo.TruncateTblDemo values (2)
Insert into dbo.TruncateTblDemo values (3)
Go

You could see that the table has 3 records in it:

Select * from dbo.TruncateTblDemo

Execute a truncate statement:

Truncate Table dbo.TruncateTblDemo

After the above statement you can't get the data back, because in the default autocommit mode the truncate is committed as soon as the statement completes (unless you have turned IMPLICIT_TRANSACTIONS on, in which case the transaction stays open until you explicitly commit or roll back).

That is, internally SQL Server treats our truncate statement as follows:

Begin Tran
Truncate Table dbo.TruncateTblDemo
Commit Tran

Select * from dbo.truncatetbldemo

Getting the data back after a Truncate!!

Let me insert the same records into that table again for testing purposes.

Insert into dbo.TruncateTblDemo values (1)
Insert into dbo.TruncateTblDemo values (2)
Insert into dbo.TruncateTblDemo values (3)

Begin Tran
Truncate table dbo.truncatetbldemo
Rollback Tran

Now you can see those 3 records which were truncated:

Select * from dbo.truncatetbldemo

Clean up:

Drop table dbo.truncatetbldemo


Tuesday, December 12, 2006

Windows development chief: 'I would buy a Mac if I didn't work for Microsoft'

Check out this article on computerworld.com. Maybe after this at least we will take our company policies very seriously :) How many of you are deleting unwanted mails from your official mailbox as per your company policy?


Friday, December 08, 2006

Fun with SQL Server ...

Read this first:

1. This has no real-world use. So try this out only when you are free :)
2. This script was done just for fun and nothing more.
3. Before executing this script change your result display mode to 'Text' (Ctrl + T)
4. There are easier ways of doing the same thing. Just to make it look complex I have done it this way :)

Code snippet Starts here:

Set nocount on
Declare @TblLayout table([ID] int Identity, Canvas Char(75))

Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)
Insert into @TblLayout Select Replicate (Convert(Varchar, 0x7E), 75)

--- Vz employees House ...
Update @TblLayout Set Canvas = Stuff(Canvas, 42, 24 , Convert(Varchar, 0x7C5F5F5F5F5F5F5F5F5F5F5F5F5F5F7C5F5F5F5F5F7C))Where ID = 5


Update @TblLayout Set Canvas = Stuff(Canvas, 42, 24 , Convert(Varchar, 0x7C2020205B20205D20205B20205D207C20202020207C)) Where ID = 4

Update @TblLayout Set Canvas = Stuff(Canvas, 42, 24 , Convert(Varchar, 0x2F2F2F2F2F2F2F2F2F2F2F2F2F2F2F2F5C5C5C5C5C5C)) Where ID = 3

Update @TblLayout Set Canvas = Stuff(Canvas, 43, 23 , Convert(Varchar, 0x2F2F2F2F2F2F2F2F2F2F2F2F2F2F2F2F5C5C5C5C)) Where ID = 2

Update @TblLayout Set Canvas = Stuff(Canvas, 44, 21, Convert(Varchar, 0x5F5F5F5F5F5F5F5F5F5F5F5F5F5F5F5F5F5F)) Where ID = 1

-- Picture of a Guy ...
Update @TblLayout Set Canvas = Stuff(Canvas, 10,10, Convert(Varchar, 0x285F5F5F5B5F5F5F29)) Where ID = 27

Update @TblLayout Set Canvas = Stuff(Canvas, 11,8, Convert(Varchar, 0x7C5F5F7C5F5F7C)) Where ID = 26

Update @TblLayout Set Canvas = Stuff(Canvas, 11,8, Convert(Varchar, 0x7C20207C20207C)) Where ID = 25

Update @TblLayout Set Canvas = Stuff(Canvas, 11,8, Convert(Varchar, 0x7C20207C20207C)) Where ID = 24

Update @TblLayout Set Canvas = Stuff(Canvas, 10,9, Convert(Varchar, 0x5C7C5F5F5F5F2F202F)) Where ID = 23

Update @TblLayout Set Canvas = Stuff(Canvas, 9,11, Convert(Varchar, 0x5C207C20416553202F202F)) Where ID = 22

Update @TblLayout Set Canvas = Stuff(Canvas, 9,11, Convert(Varchar, 0x2F5F2F20415453205C5F5C)) Where ID = 21

Update @TblLayout Set Canvas = Stuff(Canvas, 10,9, Convert(Varchar, 0x2F202060606020205C)) Where ID = 20

Update @TblLayout Set Canvas = Stuff(Canvas, 11,7, Convert(Varchar, 0x5F5C5F755F2F5F)) Where ID = 19

Update @TblLayout Set Canvas = Stuff(Canvas, 11,7, Convert(Varchar, 0x5C20302030202F)) Where ID = 18

Update @TblLayout Set Canvas = Stuff(Canvas, 11,7, Convert(Varchar, 0x2F2F2F2F2F2F2F)) Where ID = 17

--Time to travel
Update @TblLayout Set Canvas = Stuff(Canvas, 11,3, Convert(Varchar, 0x282029)) Where ID = 6

Update @TblLayout Set Canvas = Stuff(Canvas, 10,3, Convert(Varchar, 0x282029)) Where ID = 7

Update @TblLayout Set Canvas = Stuff(Canvas, 7,27, Convert(Varchar, 0x5F5F2F202F5F5F5F5F5F20202020205F5F5F5F5F5F5F5F5F)) Where ID = 8

Update @TblLayout Set Canvas = Stuff(Canvas, 5,27, Convert(Varchar, 0x287C202020202020202020207C2020207C2020202020202020207C)) Where ID = 9

Update @TblLayout Set Canvas = Stuff(Canvas, 6,26, Convert(Varchar, 0x7C205B5D205B5D205B5D207C292D287C205B5D205B5D205B5D7C)) Where ID = 10

Update @TblLayout Set Canvas = Stuff(Canvas, 6,27, Convert(Varchar,
0x7C5F5F5F5F5F5F5F5F5F5F7C2020207C5F5F5F5F5F5F5F5F5F7C)) Where ID = 11

Update @TblLayout Set Canvas = Stuff(Canvas, 7,24, Convert(Varchar, 0x303030303030303030302020202020303030303030303030)) Where ID = 12

-- Art by ....
Update @TblLayout Set Canvas = Stuff(Canvas, 54,14, Convert(Varchar, 0x417274204279205661646976656C))Where ID = 27

Update @TblLayout Set Canvas = Stuff(Canvas, 32,20, Convert(Varchar, 0x5368616D656C6573732070726F6D6F74696F6E3A))Where ID = 28

Update @TblLayout Set Canvas = Stuff(Canvas, 52,23, Convert(Varchar, 0x7661646976656C2E626C6F6773706F742E636F6D203A29))Where ID = 28

Select Canvas [ ] FROM @TblLayout


Closing Notes: I actually got this idea after going through Roji's post here


Thursday, November 30, 2006

Swapping two integer values in C# [Interview Question]

Of late, many of my friends were talking about this problem. It seems they ask this question frequently in Microsoft interviews :) I remember asking this question to freshers in 2005!

Just thought I would refresh my own knowledge on this as well :) So here are a few samples which I tried, for your reference.

Method 1: Using intermediate temp variable

int intNumOne = 1, intNumTwo = 2;
int intTempVariable;

//Swapping of numbers starts here
intTempVariable = intNumOne;
intNumOne = intNumTwo;
intNumTwo = intTempVariable;

Response.Write("Value of First Variable :: " + intNumOne.ToString() + "<br>");
Response.Write("Value of Second Variable :: " + intNumTwo.ToString() + "<br>");

Method 2: Without using Temp Variable and by using 8th standard Mathematics :)

int intNumOne = 11, intNumTwo = 22;

//Swapping of numbers starts here
intNumOne = intNumOne + intNumTwo;
intNumTwo = intNumOne - intNumTwo;
intNumOne = intNumOne - intNumTwo;

Response.Write("Value of First Variable :: " + intNumOne.ToString() + "<br>");
Response.Write("Value of Second Variable :: " + intNumTwo.ToString() + "<br>");

Method 3: Without using Temp Variable and by using bitwise XOR (^ symbol in C#)

int intNumOne = 9, intNumTwo = 1;

//Swapping of numbers starts here
intNumOne ^= intNumTwo;
intNumTwo ^= intNumOne;
intNumOne ^= intNumTwo;

Response.Write("Value of First Variable :: " + intNumOne.ToString() + "<br>");
Response.Write("Value of Second Variable :: " + intNumTwo.ToString() + "<br>");

Related articles ::

1. Still confused, or not aware of how XOR works? Check out http://en.wikipedia.org/wiki/Xor_swap_algorithm to get some insight into this.

2. Using a temp variable seems to be the best method (performance wise) as per this -- http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=457179&SiteID=1

Wednesday, November 15, 2006

Different Types of Partitioning Operations in SQL Server 2005

In this post let me explain the three different types of operations one can do with partitions. They are:

1. Split Partition
2. Merge Partition
3. Switch Partition (the most important of the lot)

Before reading further, make sure that you have read my earlier posts. That is, this and this.

Split Partition

To split a partition we need to make use of the "Alter Partition Function" syntax. So in our existing partition function let's create a new range with the boundary value "Jan 01, 1970".

Alter Partition Function PF_DOB_Range()
Split Range ('01-01-1970')

Now if we execute this code snippet it would throw an error something like this:

Msg 7707, Level 16, State 1, Line 1
The associated partition function 'PF_DOB_Range' generates more partitions than there are file groups mentioned in the scheme 'PS_DOB_1'.


So the way to split a partition is:

Step 1: Create a new Filegroup (if you don't already have an extra filegroup)
Step 2: Make use of that Filegroup while altering the Partition Scheme.
Step 3: Add a “File” to the newly created Filegroup.
Step 4: Now execute the above “Alter Partition Function”

Step 1: Create a new Filegroup (if you don't already have an extra filegroup)

Alter Database VadivelTesting Add Filegroup FileGrp3
Go

Step 2: Make use of that Filegroup while altering the Partition Scheme.

Alter Partition Scheme PS_DOB_1
Next Used FileGrp3;
Go


Step 3: Add a “File” to the newly created Filegroup.

Alter Database VadivelTesting
Add File
(
Name = Vel2006NewFile,
Filename = 'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Vel2006NewFile.ndf',
Size = 1MB,
Maxsize = 100MB,
Filegrowth = 1MB
)
To FileGroup FileGrp3
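
Step 4: Re-run the split. This is the same "Alter Partition Function" statement shown above; now that the scheme has a next-used filegroup it should complete without the earlier error.

Alter Partition Function PF_DOB_Range()
Split Range ('01-01-1970')
Go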


Verification

If it is mentioned as "Range Right" then it works like this:

1. All records whose DOB is < Jan 1, 1960 would fall into Partition 1
2. All records whose DOB is >= Jan 1, 1960 and < Jan 1, 1970 would fall into Partition 2
3. All records whose DOB is >= Jan 1, 1970 and < Jan 1, 1980 would fall into Partition 3
4. All records whose DOB is >= Jan 1, 1980 would fall under Partition 4.

Till now there were only 3 partitions for the dbo.EmpDetails table. Now that we have added another new partition, execute the below code to see the new partition which got created.

Select * from dbo.EmpDetails Where $PARTITION.PF_DOB_Range(DateOfBirth) = 4

Merge Partition

If for some reason we need to MERGE a partition, that is also possible. We need to make use of the same syntax which we used for splitting partitions and just replace the word "Split" with "Merge" :)

Alter Partition Function PF_DOB_Range()
Merge Range ('01-01-1970')

After executing the above statement we would be having the same old 3 partitions as we saw in this post.

Switch Partition

Extract from MSDN


Switches a block of data in one of the following ways:

· Reassigns all data of a table as a partition to an already-existing partitioned table.
· Switches a partition from one partitioned table to another.
· Reassigns all data in one partition of a partitioned table to an existing non-partitioned table.

Let's try to move the data in one of the already existing partitions to a new table. For example, let's move Partition 3 of the dbo.EmpDetails table to dbo.EmpDetails_StagingTable.

--- Create a new table with the same structure of EmpDetails table.
Create Table EmpDetails_StagingTable
(
EmpID int identity,
EmpName Varchar(25),
DateOfBirth datetime,
Salary int
)
Go

---Try to switch the records in Partition 3 into this.
Alter table dbo.EmpDetails Switch Partition 3 To dbo.EmpDetails_StagingTable;
Go

If we execute the above statement, it would throw an error something like this:

Msg 4939, Level 16, State 1, Line 1
ALTER TABLE SWITCH statement failed. table 'VadivelTesting.dbo.EmpDetails_StagingTable' is in filegroup 'PRIMARY' and partition 3 of table 'VadivelTesting.dbo.EmpDetails' is in filegroup 'FG3'.

Hope the error message is self-explanatory.

::Tip 1

To overcome this we need to create the staging table on the same filegroup as Partition 3 (or whatever partition you are trying to move).

Drop table dbo.EmpDetails_StagingTable
Go

Create Table EmpDetails_StagingTable
(
EmpID int identity,
EmpName Varchar(25),
DateOfBirth datetime,
Salary int
)
on FG3;


::Tip 2

The staging table needs to be empty if we plan to SWITCH records from a partition into it. For example, let us insert the below record into that table and then try to switch the partition.

Insert into dbo.EmpDetails_StagingTable values ('V','1976-03-23',500)
Go

Alter table dbo.EmpDetails Switch Partition 3 To dbo.EmpDetails_StagingTable;
Go

Msg 4905, Level 16, State 1, Line 1
ALTER TABLE SWITCH statement failed. The target table 'VadivelTesting.dbo.EmpDetails_StagingTable' must be empty.

So delete that record and then try this.

Delete from dbo.EmpDetails_StagingTable
Go


Alter table dbo.EmpDetails Switch Partition 3 To dbo.EmpDetails_StagingTable;
Go

-- To verify whether the records have been moved from Partition 3 to this newly created table.
Select * from dbo.EmpDetails_StagingTable;

Records in all other partitions would remain as-is; only this 3rd partition is now empty. You can verify it by executing this line.

Select * from dbo.EmpDetails Where $PARTITION.PF_DOB_Range(DateOfBirth) = 3

Monday, November 13, 2006

Example for Creating and using Partitions in SQL Server 2005

Let's assume that we have a table which contains records of our company's transactions starting from the date when our company was started 15 years ago! You would understand that the table would have a hell of a lot of data as it would be holding 15 years of data. But effectively we might be using only the last 2 months or, at the max, 1 year of data very frequently.

For each query, SQL Server would be processing this huge amount of data. Bottom line: as the table grows larger the performance would go for a toss, and scalability and managing the data would also become difficult.

Hope I have made the point clear. With the help of partitioning a table we can achieve a great level of performance and also make managing tables easier. In our case, one way to increase the performance would be to "Partition" the data on a yearly basis and store it on different filegroups (SQL 2005 allows you to partition tables based on specific data usage patterns using defined ranges or lists).

For further theoretical knowledge on this subject check out my earlier post on this topic.

OK enough of theory :) let’s get into the example right away.

--Add a filegroup
Alter Database VadivelTesting Add FileGroup FileGrp1
Alter Database VadivelTesting Add FileGroup FileGrp2

--Add few Files into the existing FileGroups.
Alter Database VadivelTesting
Add File
(
Name = Vel2006Q3,
Filename = 'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Vel2006Q3.ndf',
Size = 1MB,
Maxsize = 100MB,
Filegrowth = 1MB
)
To FileGroup FileGrp1

Alter Database VadivelTesting
Add File
(
Name = Vel2006Q4,
Filename = 'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Vel2006Q4.ndf',
Size = 1MB,
Maxsize = 100MB,
Filegrowth = 1MB
)
To FileGroup FileGrp2

-- Creating a partition function
Create Partition Function PF_DOB_Range (DateTime)
As Range Right
For Values
(
'01-01-1960',
'01-01-1980'
)


While creating a “Partition Function”, ranges are of two types: “Range Right” and “Range Left”. If it is mentioned as "Range Right" then it works like this:

1. All records whose DOB is < Jan 1, 1960 would fall into Partition 1
2. All records whose DOB is >= Jan 1, 1960 and < Jan 1, 1980 would fall into Partition 2
3. All records whose DOB is >= Jan 1, 1980 would fall under Partition 3.

If it is mentioned as "Range Left" then it works like this:

1. All records whose DOB is <= Jan 1, 1960 would fall into Partition 1
2. All records whose DOB is > Jan 1, 1960 and <= Jan 1, 1980 would fall into Partition 2
3. All records whose DOB is > Jan 1, 1980 would fall under Partition 3.

-- Creating a partition scheme
Create Partition Scheme PS_DOB_1
As Partition PF_DOB_Range To
(FileGrp1, FileGrp2, [PRIMARY])

-- Create a Partitioned Table using the above Scheme.
Create Table EmpDetails
(
EmpID int identity,
EmpName Varchar(25),
DateOfBirth datetime,
Salary int
)
On PS_DOB_1(DateOfBirth)

Tip: In the CREATE TABLE syntax, the name after the ON keyword is normally a filegroup name. But in the above example we have mentioned a partition scheme name. When reading a script, the way to identify whether it is a scheme name or a filegroup name is to check whether it has a parameter or not. That is, a scheme name comes with a parameter whereas a filegroup name doesn't.

-- Insert Dummy records
Insert into dbo.EmpDetails values ('Vadivel','1976-03-23',5000)
Insert into dbo.EmpDetails values ('Sailakshmi','1975-03-23',6000)
Insert into dbo.EmpDetails values ('Vicky','1986-04-30',7000)
Insert into dbo.EmpDetails values ('Test','1954-02-02',2000)
Insert into dbo.EmpDetails values ('Sivaji','1960-02-02',3000)
Insert into dbo.EmpDetails values ('Xyz','1960-01-01',2000)
Insert into dbo.EmpDetails values ('SSS','1980-01-01',3000)
Insert into dbo.EmpDetails values ('AAA','1959-12-31',3000)


That’s all. Going forward, based on the select query the "SQL Engine" internally decides which "Partition" to query and fetches the records automatically. Bottom line: we need not do anything specific to segregate records from different partitions while writing T-SQL statements or stored procedures.

That said, if at all you want to verify the partition number into which a set of partitioning column values would be mapped for any specified partition function, you can make use of the $Partition function. In other words, $Partition is used to find the data in the appropriate partition.

i) Fetch all records which got inserted into First Partition
Select * from dbo.EmpDetails Where $PARTITION.PF_DOB_Range(DateOfBirth) = 1

ii) Fetch all records which got inserted into Second Partition
Select * from dbo.EmpDetails Where $PARTITION.PF_DOB_Range(DateOfBirth) = 2

iii) Fetch all records which got inserted into Third Partition
Select * from dbo.EmpDetails Where $PARTITION.PF_DOB_Range(DateOfBirth) = 3

iv) This query would fetch you the partition number and the number of records in it.
Select * from Sys.Partitions where object_id = object_id('VadivelTesting.dbo.EmpDetails')

Or

Select object_name(object_id), partition_number, rows from Sys.Partitions where object_id = object_id('VadivelTesting.dbo.EmpDetails')

Another way to verify the partition ranges!

Select $partition.PF_DOB_Range(DateOfBirth) As 'Partition Number', count(*) As 'Rows' From dbo.EmpDetails Group by $partition.PF_DOB_Range(DateOfBirth) Order by 'Partition Number'

Or

Select EmpID,EmpName, Salary, $partition.PF_DOB_Range(DateOfBirth) AS 'Partition Number' From dbo.EmpDetails Order by 'Partition Number', EmpID

v) To display the list of filegroups in a database and to know which FileGroup is the default.
Select * from sys.filegroups
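
vi) If you also want to verify the boundary values of the partition function itself, the catalog views below should help (a minimal sketch; I have trimmed the column list to the essentials).

Select pf.name, prv.boundary_id, prv.value
From sys.partition_functions pf
Join sys.partition_range_values prv On pf.function_id = prv.function_id
Where pf.name = 'PF_DOB_Range'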


I strongly suggest you read this MSDN article without fail.


Sunday, November 12, 2006

App_offline.htm – ASP.NET 2.0 new feature

It’s an interesting find. I got a few mails asking me for suggestions on ways to handle situations where “the application needs to be updated with the latest code base”. I was preparing a blog post something like this:

1. Normally sites are deployed in web farm scenarios. If your app is also deployed in a web farm then it’s better to bring one server down and update the code base there while all the user requests are served by the other servers in the farm. This way the downtime of the application would be almost zero.

2. If your application is deployed on a single server then either you need to face the downtime :) or temporarily create another virtual directory with the old code base. This virtual directory would remain functional till the time you update the actual directory with your latest code base. There would be some negligible amount of downtime here.

3. If you can’t create a new virtual directory for some reason, then create a static page (“SiteDownForMaintanance.htm”) and, based on a flag in your “web.config”, redirect all requests to this static page. Once the code base has been updated, change the Boolean flag in your “web.config” to false.

I am sure there might be other options to resolve this too. So I started (re)searching through the web to read how experts have handled similar situations in their projects. During one such (re)search I found about “App_offline.htm” concept in ASP.NET 2.0.

The concept seems to be really simple / cool. Let me explain it for those who haven’t heard about it.

1. We need to create an html page with name “App_offline.htm”. Put whatever junk content you want to display to your user during the application maintenance!

2. Now run any page in your application and you will see the content of “App_offline.htm” displayed.

3. Once you are done with your updates or maintenance, remove this file from your project. That’s all.

Though I have read that “excluding this file from the project” is also enough, it didn’t work for me. It worked only when I removed it from the project or renamed it to something else.

From ScottGu’s blog I came to know that
“The way app_offline.htm works is that you place this file in the root of the application. When ASP.NET sees it, it will shut-down the app-domain for the application (and not restart it for requests) and instead send back the contents of the App_offline.htm file in response to all new dynamic requests for the application. When you are done updating the site, just delete the file and it will come back online.”

Interesting isn’t it?

There is another point which I would like to bring to your notice. From ScottGu’s blog I came to know about this as well:
“One thing I pointed out in the talk that you want to keep an eye on is a feature of IE6 called "Show Friendly Http Errors". This can be configured in the Tools->Internet Options->Advanced tab within IE, and is on by default with IE6. When this is on, and a server returns a non HTTP-200 status code with less than 512 bytes of content, IE will not show the returned HTML and instead substitutes its own generic status code message”

I am not sure whether he is wrong or whether they have changed it in IE 6.0 SP2, because I created an empty App_offline.htm page and it worked. The browser version on my system is IE 6.0 SP2 and “Show Friendly Http Error Messages” is checked by default in it.


Saturday, November 11, 2006

Time to say, Goodbye to Adobe PDF Reader!

A PDF (Portable Document Format) reader is a very important piece of software one needs, as nowadays most product user manuals, eBooks, visa application forms etc. are in PDF format.

Today’s computers almost always come with Adobe PDF Reader installed by default. Till a few weeks back I was also using it, without much satisfaction!! No doubt Adobe PDF Reader is a great product, but I hate it for the following points:

1. I feel that Adobe PDF Reader is really bulky.
2. Loading time of PDF documents is unnecessarily long in Adobe Reader.
3. Installation of Adobe PDF Reader takes at least 5 minutes.

For the past few weeks I have been fiddling with another PDF reader called “Foxit Reader 2.0”.
In my experience with this new reader, I haven’t found any of the above-mentioned disadvantages that I have with Adobe PDF Reader.

Let me explain those 3 points in detail now:

i) I feel that Adobe PDF reader software is really bulky.

First of all, downloading Adobe PDF Reader is no joke :) it takes at least 10 minutes to download using a dial-up connection. You know the size of the latest Adobe PDF Reader 7.0.8? It's ~16.4 MB!!!

Just for reading an eBook (for example), why the hell should one download such a big file?
I was really surprised and happy to see that the download size of Foxit Reader is just 1.56 MB.

Find below the screenshot of the time it took for me to download Foxit Reader:

[As of now I couldn't attach the screenshots here via blogger.com. Maybe there are some issues with the server. I will upload them later]

2. Loading time of PDF documents is unnecessarily long in Adobe Reader

Adobe PDF Reader installs lots of plug-ins, most of which are unnecessary for us. That’s the major reason the download size is so large. Also, even when you open a single-page PDF file it is pretty slow. It’s because it does lots of background checks/work which I am not interested in at all!

To increase the performance of Adobe PDF Reader, check these below links:

i) Version 6.0 : http://blogs.msdn.com/tims/archive/2004/11/24/269567.aspx
ii) Version 7.0 : http://blogs.msdn.com/jonathanh/archive/2004/12/22/330288.aspx

Why should one download and install all these unwanted plug-ins / features onto the system and then remove them in the name of increasing the performance? This wastes my internet charges, telephone charges, my time :) etc.


If you have ever been frustrated with how Acrobat drains system resources, then this PDF reader is for you.

Foxit Reader loads / opens PDF documents really quickly.

3. Installation of Adobe PDF Reader takes at least 5 minutes

Adobe PDF Reader takes at least 5 minutes to install whereas Foxit Reader was done within a minute! When I completed the wizard and clicked on the “Finish” button it took just a few seconds for the software to install.

[As of now I couldn't attach the screenshots here via blogger.com. Maybe there are some issues with the server. I will upload them later]

Based on these factors I guess Foxit is a better option than Adobe PDF Reader for simply going through a PDF file or filling in a PDF form. What do you think?

Update: Amit Agarwal (MVP) informed me about Adobe Digital Editions. Maybe I will try it out sometime tomorrow and update this post based on that.

Updated on November 13th: I got my hands dirty with "Adobe Digital Editions". I am not that impressed by this Flash-based software! It doesn't give me the feeling of reading an eBook the regular way :)

Related Post: Foxit - the Best PDF Reader

Thursday, November 09, 2006

IndiMix '06 -- Webcast

In continuation of my previous post on IndiMix, I registered for the webcast a couple of days back.

I have gone through the agenda of the seminar and was interested only in these two sessions.

1. The Genesis of Creativity and Innovation – The Next Web Now (10:15 AM to 11:15 AM)
Speaker: Steve Ballmer, CEO, Microsoft Corporation.

2. Designing for the Next Web Now – Experiences and Expressions that Capture the Imagination (2 PM to 3:15 PM).

Speaker:

i) Leon Brown, Regional Designer and User Experience Lead, Microsoft Corporation APAC (Singapore)
ii) Pandurang Nayak, Developer Evangelist, Microsoft Corporation India
iii) Deepak Gulati, Developer Evangelist, Microsoft Corporation India
iv) Dax Pandhi, CEO, Nukeation Studios

Not that I wasn't interested in the other sessions. But since I am on vacation and browsing from home, I wanted to keep my broadband usage to the minimum :)

This morning, when I tried accessing the URL (which they sent me in the registration confirmation mailer) it was throwing an error: "We're Sorry, the page you requested could not be found". I tried for almost 20 minutes but the result was the same.

Then Vinod Unny (MVP) sent me the URL to access the webcast (http://server1.msn.co.in/sp06/indimix/challenge.asp). By the time I logged into the site, Steve Ballmer's keynote was already over :(

I got to listen to the topic "Realizing the opportunity in India". Since I had missed Steve's keynote, I started to wait for Pandu / Deepak to "perform".

I have seen Deepak Gulati take sessions at TechEds. Normally I don't miss his sessions. On the flip side, I have known Pandu for a long time but I had never seen him take a session at a huge event like this. So I was eagerly waiting to see him online :) They actually didn't disappoint me. Both of their demos were great. I felt that Pandu was a bit tense during the session, which was evident on his face :)

To be frank, I really don't understand the purpose of bringing in "Mandira Bedi" and "Murali Karthik" for this event!! Do you have a say on this?

Wednesday, November 08, 2006

I am one of the Microsoft BlogStar Winner :)

Wow, I am really excited to know that I am one among the Top 20 Microsoft India BlogStar winners :) Check out http://www.microsoft.com/india/blogstars/winners.aspx

On the flip side, I am a bit disappointed that I didn't make it to the Top 5 :( Because the Top 5 guys get a chance to meet Steve Ballmer in Mumbai tomorrow.

But overall this month seems to be a really good one for me :) Because starting from November 1st, 2006 till date I have got 3 awards/gifts (including this) from various technology related sites. The other two are:

1. I have won the first prize in community-credit website this month.
2. I have won the "Best Member Award" in dotnetspider.


Tuesday, November 07, 2006

Visual Studio 2005 Tools for Office Second Edition

Visual Studio 2005 Tools for the 2007 Microsoft Office System (Visual Studio 2005 Tools for Office Second Edition or VSTO 2005 SE for short) is a free add-on to Visual Studio 2005 that empowers developers to build applications targeting the 2007 Office system. Developers can now harness the benefits of the 2007 Office system platform and create Office-based solutions using the professional development environment of Visual Studio 2005.

Go through http://msdn.microsoft.com/office/tool/vsto/2005SE/default.aspx to know about the "Feature List", "Download links", "Demos - Video" and "Articles" on this product.

SQL Server Health and History Tool

The Microsoft SQL Server Health and History Tool (SQLH2) allows you to collect information from instances of SQL Server, store this information, and run reports against the data in order to determine how SQL Server is being used.

This version of SQLH2 supports SQL Server 2005 RTM collection. SQLH2 collects four main types of information:

1. Feature Usage – What services/features are installed and running, and the level of workload on the service.
2. Configuration Settings – Machine, OS and SQL configuration settings, SQL instance and database metadata.
3. Uptime of the SQL Server service
4. Performance Counters (optional) – Used to determine performance trends

You can download this tool here


Saturday, November 04, 2006

SQL Server Hosting Toolkit Launched

The Database Publishing Wizard enables the deployment of SQL Server 2005 databases into a hosted environment on either a SQL Server 2000 or 2005 server. It generates a SQL script file which can be used to recreate the database in shared hosting environments where the only connectivity to a server is through a web-based control panel with a scripting window.

The Database Publishing Wizard is currently a command line tool, though future releases will add a graphical user interface.

1. Current Release
2. Command Line Interface
3. Supported Objects & Types
4. Known Issues
5. Roadmap for Future Releases
6. FAQ
7. Tutorial: Upload the TimeTracker ASP.NET starter kit to your hosted account

Source: codeplex


Friday, November 03, 2006

Won a prize (Best Member Award) in dotnetspider.com

Below is an extract of the mail which I got from Tony John of dotnetspider.com a couple of days back.

Vadivel Mohanakrishnan,

We are glad to inform you that you are selected as the RapTier Gift winner for the month of October 2006 for your valuable contributions to dotnetspider.com. We are looking forward to see more contributions from you in future.

Winners list is here :: http://www.dotnetspider.com/gifts/Winners.aspx

The gift which I will be getting is called "RapTier Software".



RapTier is a template-based ASP.NET, C#, VB.NET and SQL code generator that creates robust database driven applications for a wide range of database. Features: ASP.NET, C#, VB.NET and SQL code generation; DataSet and Typed Data Transfer Object support; Stored Procedure and Dynamic SQL support; Existing stored procedure wrapping; Support of many database engines; Template-based design, WebForm UI generation, Database documentation generation

To know more about it, check this out http://www.sharppower.com/

Wednesday, November 01, 2006

Won the first prize in community credit ...

I am glad that a few minutes back I came to know that I have won the 1st prize in Community-Credit for the month of October 2006.



Below is the extract from the mail which I received from David (Admin of Community-credit site).

Hello Vadivel,

Congratulations. You won 1st Prize in the October 2006 Contest. As you may know, a lot of people competed, but only a few won. Nicely done! Please also keep in mind that your prize is not just a stupid, geeky gift, but it's also a symbol to represent your contributions to the development community. When folks ask you how you won, you can tell them that it was by helping out other geeks. Be proud and brag.

Check out this page to know about the Prize I got :: http://www.community-credit.com/CommunityCreditPrizePage.aspx and choose October 2006 in the dropdown.

Tuesday, October 31, 2006

Copying Database Diagram from DB server to another ...

For some, working with Databases is fun and for others, it could possibly be a nightmare. Understanding the structure of a Database schema from the available documentation could spell doom for many developers.

A picture is worth a thousand words. This golden saying applies aptly to databases! Yes, often a pictorial representation of a database conveys more meaning about the database schema than the generated SQL scripts or other documentation that may be available. One of the first things developers want to do when starting work on a new project is to understand the database schema. Sometimes, understanding a complex database schema can be simplified largely by a relationship diagram.

The Database diagram tools in the SQL Server enable administrators and developers to create Database diagrams very easily. This article demonstrates an efficient way to move DB diagrams from one Database to another.

We are going to use two sample Databases for this purpose. They are named PlayDB and PlayDBMirror for sake of simplicity and clarity. Let us create some sample tables and one sample diagram called “sample1” in the Database PlayDB. The objective is to copy the diagrams from this database to PlayDBMirror. Let us see how this can be accomplished.

Note: It is not necessary that you create a DB called PlayDB on your server. You can also work with the Northwind or Pubs database. The bottom line is: don’t use the production server to test these out.

Run through the steps from 1 to 8:

1. Create a new Database by right clicking on "Databases" in Enterprise Manager and name it as "PlayDBMirror".

2. Now right click on your source Database (for me it is "PlayDB")

3. Choose "All Task" >> "Export Data" and choose the source and destination DB.

4. In "Specify Table Copy or Query screen" choose the second option "Use a query to specify the data to transfer" and click next.

5. In the next screen type the following query "select * from dtproperties" (Note: this would try and push ALL records from source dtproperties to destination dtproperties)

6. In "Select Source Tables and Views" click on the "Results" under heading destination, and then manually change it to dtproperties.

7. In the next screen choose "Run Immediately" check box and click on Next.

8. That’s it click on "Finish" on the last screen.

Now, we have successfully copied all the database diagrams from PlayDB to PlayDBMirror. Let us suppose we have somehow modified the database diagrams and want to move them again. Running through the steps from 1 to 8 will accomplish the task and lo here is the catch. If you check the diagrams section in the destination database (PlayDBMirror), we see the diagram sample1 copied successfully.

The internals

What happens internally is that the diagrams are stored in a table called dtproperties.

Each diagram internally has 7 rows in “dtproperties” with a unique “objectid”. They are listed below for the sake of completeness:

1. DtgSchemaOBJECT
2. DtgSchemaGUID
3. DtgSchemaNAME
4. DtgDSRefBYTES
5. DtgDSRefDATA
6. DtgSchemaBYTES
7. DtgSchemaDATA

Excuse me, did I say unique objectid?! Now here is where you need to pay attention. If this database is queried, you will find that there are 14 rows!! Which means there are two diagrams now! How strange! The records were appended to the table during the second copy operation and were not overwritten! Why? And guess what, all of them have the same object id. Which means, for one diagram, there are 14 rows in the table dtproperties. In other words, there are two diagram entries sharing the same objectid.

Let us analyze and figure what would happen if two diagrams have the same ID.

1. Double click on any one of the diagrams
2. Make some changes to it, save and close it.
3. Open the other diagram and the same changes would reflect there also. Similarly, if you drop one diagram the other one would also get deleted. Believe me!! Rarely, it has also thrown some errors for me at this step, and I have been left with no alternative other than to delete both diagrams.

The solution using Query analyzer

To move all the existing diagrams, we do this:

Insert into dtproperties
(
objectid,
property,
value,
uvalue,
lvalue,
version
)
Select
objectid,
property,
value,
uvalue,
lvalue,
version
From
PlayDB..dtproperties
Where
value not in (select value from dtproperties)


This query, when run from the Query Analyzer of the target database, copies all the diagrams to the target database but ensures that if the target database already has a diagram with the same name, it doesn’t get copied.

Let us try to copy selected diagrams from the source to the target Database.

First, to decide on the diagrams that you want to copy, run the query

Select objectid, property, value from playdb..dtproperties

In the output of this query look in the "property" column for "DtgSchemaNAME" value. The DtgSchemaNAME property would have the name of the diagram; use this in conjunction with the objectid column to locate the diagram you would like to transfer.

We should also check the dtproperties table of the destination database before the transfer. Using the above query, check that the destination dtproperties does not already have the objectid which you are going to transfer now. If it does, get the maximum objectid from the destination table and use a value greater than it within your select statement, so that you are 100% sure it is unique within the dtproperties table in the destination database. For the sake of discussion let us use an arbitrary number, 100, as my objectid.
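
For instance, a small query along these lines (run against the destination database) can be used to pick a safe new objectid; adding 1 to the current maximum is just one simple way of guaranteeing uniqueness:

Select IsNull(Max(objectid), 0) + 1 As NextObjectId
From dtproperties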

Insert into dtproperties --- this is the table in the destination db.
(
objectid,
property,
value,
uvalue,
lvalue,
version
)
Select
100, -- Object ID
property,
value,
uvalue,
lvalue,
version
From
PlayDB..dtproperties
Where
objectid = 1
--- Replace this number with the object ID of the diagram which you want to transfer

Conclusion

Backing up the source database and restoring it in the destination database can accomplish the same task. But the crux of the matter is copying specific diagrams from one Database to another. Happy programming!


Saturday, October 28, 2006

Another good set of ASP.NET FAQs

Check out syncfusion.com

It has 300+ ASP.NET related Frequently asked questions. It would be a good read.


Friday, October 27, 2006

Do you think I will make it?

Approximately 20-25 days back I came to know about Community-Credit site and I started participating in it actively. See the below screenshot to know where I stand.


FYI, Lorenzo Barbieri has been maintaining his lead from day 1. For the past 10+ days I have been holding the second position, and just 4 days are left to find out who the winner of this month is. Do you think I can make it to the top?


Wednesday, October 25, 2006

Listing / Dropping ALL stored procedures from a database in SQL Server 2005

This was the question asked by a member here http://www.dotnetspider.com/qa/ViewQuestion.aspx?QuestionId=59232

Though I answered it there itself, I thought I would document it here also for future reference.

I) List all Stored procedures in a database:

Select * from sys.procedures where [type] = 'P' and is_ms_shipped = 0 and [name] not like 'sp[_]%diagram%'

OR

Select * from sys.objects where type='p' and is_ms_shipped=0 and [name] not like 'sp[_]%diagram%'

Please note that I have added the 'NOT LIKE' part to get rid of stored procedures created during database installation.

II) Delete all [user created] Stored procedures in a database:

Select 'Drop Procedure ' + name from sys.procedures Where [type] = 'P' and is_ms_shipped = 0 and [name] not like 'sp[_]%diagram%'

This would list all the user-defined stored procedures as 'Drop Procedure procedureName' in the result pane. Just copy-paste the output into the query window and execute it once.
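
If you prefer not to copy-paste, a minimal sketch like the one below builds the same drop statements into a variable and executes them in one go. This is only a rough illustration; double-check the generated statements before running it, since it really will drop every user-defined procedure it finds.

Declare @sql nvarchar(max)
Select @sql = ''

-- Concatenate one DROP statement per user-defined procedure
Select @sql = @sql + 'Drop Procedure ' + quotename(schema_name(schema_id)) + '.' + quotename([name]) + '; '
From sys.procedures
Where [type] = 'P' and is_ms_shipped = 0 and [name] not like 'sp[_]%diagram%'

Exec sp_executesql @sql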


Sunday, October 22, 2006

Brief Theoretical Knowledge about Table Partitioning

1. Partitioning fully depends on filegroups.

2. Start table partitioning as early as possible, so that huge data movement can be avoided later.

3. A file can belong to one and only one filegroup. Filegroups actually improve database maintenance, performance and online restore.

4. For your information, the default filegroup is always the “primary filegroup“. If at all you don’t believe me, create a dummy table something like this,

Create Table Test
(
sno int
)
Go

Now in your SQL Mgmt Studio, right click on that table and check the properties yourself.

5. All “System Objects” have to belong to PRIMARY Filegroup.

6. It’s recommended, to have all “User Objects” created in another Filegroup.

7. In SQL Server 2000, a user object couldn’t span more than one filegroup. But it’s possible now in SQL Server 2005. That is, a table's data can be partitioned and stored across multiple filegroups.

8. If you have two large tables, put them into different Filegroups. Filegroups are recommended being on two different “Physical” drives.

A physical drive (hard disk) has a spindle, and its “reads per second” and “writes per second” in turn determine the performance. While joining these 2 tables, parallel retrieval happens based on their spindle speed (since it involves two different physical drives). So there would be a boost in performance.

9. Filegroups are database specific. That is, if you create a “Filegroup1” for Database1 it won’t be available for any other databases.

10. Tables and Indexes can reside on different Filegroups.

11. Partitions can be created based on a “partition key”. Typically it would be based on a “Date” field.

12. People querying the table need not bother about fetching records from different partition. Because SQL Engine takes care of it internally.

13. You can create separate partition for storing historic data.

14. Maintenance plan needs to be created based on the Filegroup.

15. By the way, we can also mark a filegroup as read-only. Lookup tables, or tables which don’t tend to change at all, should be placed in read-only filegroups. So it’s enough if we back up this filegroup once (see the sketch after this list).

16. In SQL Server 2000, we needed to “Restore” all filegroups before the database could be brought online. In SQL Server 2005, once the primary filegroup is up we can bring the database online. So grouping into filegroups should be done functionality-wise.
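
For point 15, here is a minimal sketch of marking a filegroup read-only and back. The database and filegroup names are just the ones from my earlier partitioning examples, so treat them as placeholders, and note that SQL Server places some restrictions on when a filegroup can be switched; try this on a test database first.

-- Make the filegroup read-only
Alter Database VadivelTesting Modify Filegroup FileGrp1 Read_Only
Go

-- Switch it back if the data ever needs to change again
Alter Database VadivelTesting Modify Filegroup FileGrp1 Read_Write
Go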

In the next post I would talk about the steps involved in creating a partition and the different types of operations in a partition.


Tuesday, October 17, 2006

Accessing a table in SQL Server 2005 (Schema related)

Recently in a discussion forum I came across this question: "I have a small doubt in SQL Server. The database name is 'AdventureWorks' and the table name is 'Production.Culture'. What is the query to fetch records from that table?"

The answer to this is :

Select cultureid, [name], modifiedDate from AdventureWorks.Production.Culture

There was another member of the forum who was saying that we only need to use the below query:

Select * from AdventureWorks.dbo.Production.Culture

Please note that he has mentioned two schema names (dbo and Production) in a single query. This query wouldn't work, and I thought I would explain the concept of "Schemas" in SQL Server 2005 to him.

In short,

a) A schema is similar to namespaces in .NET
b) It helps in the logical grouping of tables.
c) To view all existing schemas in a database, for example within the AdventureWorks DB ::

1. Open your SQL Mgmt Studio
2. Go to AdventureWorks DB and expand it
3. Expand the Tables

4. Now just observe what you see within that. There would be,

i) few tables which starts with "dbo.",
ii) few tables which starts with "Person."
iii) few tables which starts with "Production." etc.,

Those are all called as "Schema Names".

The other, easier way to find this is to navigate to the "Security" node of the database. There we will find a folder by the name "Schemas". Expand that to see all the schema names related to that DB.

5. Now click on "New query" button and try this

--- This would work
Select * from person.address

6. Now try this

--- This wouldn't work
Select * from dbo.person.address

7. Just for your confirmation, navigate to some other database in your query window. For example,
Use Master
Go

--- This would work
Select * from adventureworks.person.address

--- This would work
Select * from adventureworks.dbo.AWBuildVersion

--- This would FAIL
Select * from adventureworks.dbo.person.address

8) That said, by DEFAULT when we create any new table it would be assigned to "dbo." schema.

This you can verify by going to your SQL Mgmt Studio again: navigate to any database of your choice and create a sample table. Now expand the "Tables" folder to see "dbo.YourTableName" (the table which you have just created).

So the bottom line is that when accessing a table in SQL Server 2005, it should be done like this:

ServerName.DatabaseName.SchemaName.TableName


Please note that if we want, we can create our own schema and then assign tables to it. Also, when a new user is being created we can assign a default schema to them. This merits a separate discussion by itself, so I will write on it sometime soon.
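
As a quick illustration of that, creating a schema of your own and placing a table in it looks roughly like this (the names "Sales" and "Orders" are hypothetical):

Create Schema Sales
Go

Create Table Sales.Orders
(
OrderID int,
OrderDate datetime
)
Go

-- Accessed as SchemaName.TableName within the database
Select * from Sales.Orders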


Monday, October 16, 2006

Find out the Second (2nd) Highest Salary ...

For the past 15 days I have been actively participating in the discussion forums of dotnetspider. Yesterday there was a question relating to SQL Server which seems to be asked in interviews very often, i.e., how to find out the 2nd highest salary of an employee?

Thought I would give out a sample for those who are yet to find an answer for this. For that purpose I have given a table structure with a few sample records inserted into it.

-- Table Structure
Create table employee
(
[Name] varchar(20),
Sal int
)
Go

-- Populate Sample Records
Insert into employee values ('Vadivel', 80000)
Insert into employee values ('Vel', 70000)
Insert into employee values ('xxx', 40000)
Insert into employee values ('Sneha', 60000)

The Solution:

Select top 1 [Name], Sal from
(
Select Top 2 [Name], Sal from employee order by Sal Desc
) AS A1 Order by Sal
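
A couple of alternative ways to get the same result (both assume the same employee table as above). The first is the classic MAX-based approach; the second uses the Row_Number() function available from SQL Server 2005 onwards. Note that if two employees share the top salary you may want Dense_Rank() instead of Row_Number().

-- Option 1: the second highest salary is the highest salary below the overall maximum
Select Max(Sal) as SecondHighestSal
From employee
Where Sal < (Select Max(Sal) From employee)

-- Option 2 (SQL Server 2005 only): number the rows by salary in descending order and pick row 2
Select [Name], Sal From
(
Select [Name], Sal, Row_Number() Over (Order by Sal Desc) as RowNum
From employee
) As T
Where RowNum = 2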


Technorati tags: , , ,

Saturday, October 14, 2006

Desi Search Engine ...

Guruji.com - India's local search engine! - was launched in Bangalore on the 12th of this month. The site seems to be focused on Indian consumers. Maybe they want to become the Google of India!

I tried out a few searches - the results and performance are really impressive. The search strings which I tried are as follows:

1. Pizza Adyar
2. Verizon Guindy

Quite a few of my search strings also failed. It's understandable as they are still in beta! But I feel it's definitely a good start.

Guys, use it and share your experience...

If you'd like to know more about the investors, management team, etc., check this out.

Extract from TechTree:

Gaurav Mishra, co-founder and coo, guruji.com explains, "90% of Internet search queries are local in nature, and guruji.com will deliver better search results than any other search engine in these instances. For example, if a user types a search "Pizza in Koramangala, Bangalore " or "Chinese restaurant Juhu, Mumbai" the user will be able to see local business listings as well as articles, reviews, blogs, or any other web references."

Extract from cyberabad.biz:

Two Indian Institute of Technology (IIT) Delhi graduates have returned from the Silicon Valley to launch a home-grown search engine with loads of Indian content.

In a bid to offer an alternative search engine and wean away the growing community of Internet buffs from established search engines like Google and Yahoo! Anurag Dod and Gaurav Mishra have co-founded www.guruji.com with a $7-million (Rs.322 million) seed fund from Sequoia Capital India, a venture capital firm.

Technorati tags: , ,

Monday, October 09, 2006

Google I'm Feeling Lucky :)

From a fellow MVP (Harish Ranganathan) I came to know about this interesting fact.

Old News: If you search for "Failure" in Google and click on the "I'm Feeling Lucky" button, it takes you to President Bush's homepage :)

Latest News: Now, interestingly, if you search for the word "Search" in Google and click on the "I'm Feeling Lucky" button, it goes to the Windows Live Search page.

Technorati tags: , ,

Saturday, October 07, 2006

Tip to prepare for interview -- Part II

This post is written having final year students in mind.

The first part of this series is here. So if you haven't read it before, I suggest you have a look at it as well.

Apart from your regular preparation for the final year exams and projects, it’s always better to do the following:

1. Spend time on group discussions
2. Work on quantitative aptitude and reasoning questions again and again. Some well-known authors on this topic are listed below for easy reference.
a. George Summers
b. Shakuntala Devi
c. R.S. Agarwal
d. P.S. Agarwal
e. Barrons GRE

3. Spend time on websites to know more about Interviews. For example:
www.freshersworld.com

4. If a company calls you for an interview, collect information about the company (visit their website first, without fail).

5. Prepare a write-up in your own vernacular and try to translate it. Practice it in front of your friends, your lecturers or a mirror, and fine-tune it.

6. Prepare a neat CV.

Interview

Telephonic:

1. Provide a contact number at which you are reachable. Be prepared an hour in advance. Keep your CV with you at the time of the interview.

2. Attend the interview in a calm atmosphere. If you are not comfortable at a particular moment, politely request them to call back after a specified time.

3. Start on a pleasant note. Say "Hi" along with the person's name after hearing it, or just say "Hi".

4. During the conversation, listen carefully and answer accordingly. Understand the question properly before answering. If there is any confusion, pause, explain what you have understood from the question and then continue.

5. Your conversation should be free-flowing and confident.

6. Be specific in your conversation.

7. Before concluding, feel free to ask your doubts and questions, and end on a pleasant note.

8. Do not overdo it :)

In Person

1. Be present at the interview venue half an hour early.
2. Present yourself in formals.
3. Have a copy of your CV and other testimonials with you.
4. Look confident and relaxed
5. Your conversation should be free-flowing and confident.
6. Be specific in your conversation.

Observations by the panel members

Broadly, the issues observed fall into three categories:

1. Poor communication skills
2. Attitude issues, like arrogant mannerisms or a lethargic attitude.
3. Fake project / company details.

The first two are things you need to work on and improve. The third is something you need to avoid religiously.

A few standard questions which you can expect in any interview are:

1. Tell me about yourself!
2. What are your strengths and weaknesses?

3. Brief them on the projects you last worked on, OR explain the project you created during your final year.

Points normally observed are: what role the candidate played, whether he/she was a team member, whether he/she initiated it, whether he/she is well informed, and whether he/she is able to quantify the work.

4. Try to quantify your work.

Points to note:

1. Please be aware of when, and with whom, to do your negotiations.

2. Avoid showing fake projects or companies in your CV, as most of the major companies do background verification of candidates.

3. Don't ever crib about your previous employer / manager / team members, etc. This would give a bad impression of you.

After Joining

1. Be participative
2. Be proactive
3. Be a team player
4. Start documenting whatever you do.
5. Write simple and clear mail to your team leader and manager.
6. Understand the project you are in. If you don't, refer to the documents related to the project.

I will be updating this post periodically. I welcome suggestions and points to be added here from all of you!

Technorati tags: , ,

Friday, October 06, 2006

Finding the File Size of Attachment in Javascript

<html>
<head>
<title>Check the Size of a File -- Javascript</title>

<script language="JavaScript">
// Note: Scripting.FileSystemObject is an ActiveX/COM object, so this only works
// in Internet Explorer with ActiveX enabled on the client machine.
function CheckFileSize()
{
    var objFSO = new ActiveXObject("Scripting.FileSystemObject");
    // GetFile() returns a File object for the path entered in the file input
    var objFile = objFSO.GetFile(document.frmFileUpload.strFile.value);
    var SizeOfFile = objFile.size;   // size of the file in bytes
    alert(SizeOfFile + " bytes");
}
</script>
</head>
<body>
<form name="frmFileUpload">
<input type="file" name="strFile">
<input type="button" name="btnSize" value="Show file size" onClick="CheckFileSize();">
</form>
</body>
</html>


The above code snippet will work only if ActiveX is supported at the client end (in practice, Internet Explorer with ActiveX enabled).

Monday, October 02, 2006

Paging in SQL Server 2000 and 2005?

Abstract

In this article I discuss in detail an easier way of paging records using SQL Server 2005. Along the way I introduce the nuances of the TOP keyword and the cool enhancements added to it. I also show how to use the TOP keyword to do paging in SQL Server 2000, albeit at a performance loss, and then show how the same result can be achieved with higher performance thanks to the new features in SQL Server 2005.

Introduction

Database products have always attracted me for many reasons. One simple reason: .NET Framework 1.0 was released and then, in a short time, version 1.1 followed. If you have been watching the market you would know that, a few months back, .NET Framework 2.0 was also released. So what's the point I am trying to make? The application layer, or front-end layer, changes very frequently and we need to keep updating ourselves on a daily basis. That's the pace of technology - either keep up or be left out :)

On the flip side, if we take SQL Server, a version was released in 2000 and then there was a five-year gap before the next major version. This gives us enough time to master the product and appreciate its advantages. Approximately 6 months back SQL Server 2005 was released (it was previously code-named Yukon).

Paging is one of the very common features expected in any application. It can either be done at the “Application level” or at the “Database level”. In this article I am concentrating on “Database level Paging”.

Though paging can be done in SQL Server 2000 itself, there are performance issues with it: we need to use either temporary tables or dynamic queries to do paging in SQL Server 2000, which is a drawback in itself. I have shown the way I used to do paging in the past with a fully functional stored procedure. Subsequently I have discussed the way to avoid temporary tables and dynamic queries when paging records in SQL Server 2005.

In short, I have shown the way to do paging using the following new features introduced in SQL Server 2005:

1. Enhancements to TOP keyword
2. Row_number() function
3. CTE – Common Table Expression

Enhancements to Top Keyword:

The TOP keyword was very much available in previous versions of SQL Server. To appreciate the enhancements, one first needs to understand the limitations of the TOP keyword in SQL Server 2000. So let me walk you through them.

In T-SQL, the TOP keyword is used to limit the number of rows returned, i.e., to list a fixed number of rows from the top. For example,

-- This would list the top 5 records from the specified table
Select Top 5 FieldName from TableName

What if I know the Top "N" only at runtime? Is there a way to dynamically pass the number to the TOP keyword? The answer is a big NO.

The workaround people used was the SET ROWCOUNT statement, which does accept a parameter/variable. The downside is that it affects every query in the current connection.

Example:
Set Rowcount @LimitRows
Select fieldname from tablename
Set Rowcount 0


The last statement in the above example returns SQL Server to its normal state. If that line is skipped, all subsequent queries on the connection would also return only @LimitRows rows. Hope I have made the limitation clear. Now let's start looking at the enhancements made to this keyword in SQL Server 2005.

1. The "TOP" keyword now accepts parameters.
2. The "TOP" keyword can now be used in Insert, Update and Delete statements as well.
3. A T-SQL query (expression) can now be passed as a parameter to the "TOP" keyword.

Database Design

For the rest of this article, the queries I write are based on a user-defined table called "Sales". Let me give you the script for it, along with some sample data to populate it. This will surely help you test the code then and there for better understanding.

I refrained from writing my scripts around either the Northwind or Pubs database because those databases don't come with SQL Server 2005 by default. The workaround is to download the scripts for those two databases from the Microsoft site, but the download is about 1.5 MB. Downloading it just to test the scripts in this article would be asking too much :) - that's the reason I created a sample table and worked with it.

-- Table structure
Create table Sales
(
Sno int identity,
YearOfSales Int,
SalesQuarter Int,
Amount money
)
go

-- Test records to populate the Sales table
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2004, 1, 100)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2004, 2, 200)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2004, 3, 300)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2004, 4, 400)
go

Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2005, 1, 500)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2005, 2, 600)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2005, 3, 700)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2005, 4, 800)
go

Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2006, 1, 900)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2006, 2, 1000)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2006, 3, 1100)
Insert into Sales (YearOfSales, SalesQuarter, Amount) Values (2006, 4, 1200)
go

Enhancements to TOP Keyword – Explained with code snippets

The below code snippet would work in both SQL Server 2000 and SQL Server 2005

/* T-SQL statements stop processing rows once they have affected the specified number of rows */
Set RowCount 2

-- The SalesQuarter of the first two records would be set to 999
Begin Tran
Update Sales SET SalesQuarter = 999
If @@RowCount = 0
Begin
Print 'No Record found'
Set RowCount 0
End
Commit Tran

To make queries return all the rows again, we need to set ROWCOUNT back to 0.

Set RowCount 0
Select * from Sales

The code snippet below works only in SQL Server 2005.

As you can see, the Set RowCount statement is not used at all; instead we have made use of the TOP keyword.

Begin Tran
Update TOP (2) Sales SET SalesQuarter = 999
If @@RowCount = 0
Begin
Print 'No Record found'
End
Commit Tran

Passing parameters to the TOP keyword

One of the features missing in SQL Server 2000 was the ability to pass parameters to the TOP keyword. That is now possible in SQL Server 2005.

Declare @topValue Int
Set @topValue = 2
Select Top (@topValue) * from Sales

Even if you assign a decimal value, only the integer part is taken and the decimal part is ignored. The query below would list the top 3 records.

Declare @topValue Int
Set @topValue = 3.9
Select Top (@topValue) * from Sales


Listing Top n percentage of records from a table

Declare @topValue Int
Set @topValue = 25
Select TOP (@topValue) Percent * from Sales

Of all these, the coolest enhancement is that we can now pass a query as a parameter to the TOP keyword. The code snippet below helps us find the top half of the existing records.

-- It would list 50% of the records
Select TOP (Select cast((Select Count(*)/2 from Sales ) as int)) * from Sales

Hopefully you now have a feel for what's in store as far as the TOP keyword is concerned in SQL Server 2005.

In a nutshell, the TOP keyword in SQL Server 2005 can be used for the following:
1. It can be used to specify the number or percentage of rows to be returned.
2. It can be used along with Insert, Update and Delete statements (a quick sketch of the Delete and Insert cases follows this list).
3. We can now pass expressions to it.
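
The article above demonstrated only the Update case from point 2, so here is a minimal sketch of the Delete and Insert cases against the same Sales table (treat it as illustrative only; without an ORDER BY there is no guarantee which rows count as the "top" ones):

-- Delete any 2 rows from Sales
Delete TOP (2) From Sales

-- Insert only 2 of the rows returned by the SELECT back into Sales
Insert TOP (2) Into Sales (YearOfSales, SalesQuarter, Amount)
Select YearOfSales, SalesQuarter, Amount From Sales Where YearOfSales = 2006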

Now, let me explain the way in which I used to do paging while working with SQL Server 2000.

Paging explained in SQL Server 2000

There are two ways: we need to make use of either temporary tables or dynamic queries for paging in SQL Server 2000. In SQL Server 2000, the maximum length of a dynamic SQL statement passed this way is limited to the maximum size of an nvarchar value, i.e., 4000 characters. Since I want to keep the sample as simple as possible, and since I am sure this example won't need more than 1000 characters, I have declared @strQuery as nvarchar(1000).

Create Procedure dbo.testPaging_SalesTable
@intNumOfRecords int,
@intPageNum int
As

/*
Stored Procedure: dbo.testPaging_SalesTable
Creation Date: 06/20/2006
Written by: Vadivel Mohanakrishnan

Purpose : Helps in paging records within Sales table by just passing two variables.

Testing :
Exec dbo.testPaging_SalesTable 5,0 -- first page
Exec dbo.testPaging_SalesTable 5,1 -- second page
*/

Declare @strQuery nvarchar(1000)
Declare @intTotal int
Set @intTotal = @intNumOfRecords * @intPageNum

Set @strQuery = 'Select TOP ' + Cast(@intNumOfRecords as Varchar(5)) + ' * From Sales'
Set @strQuery = @strQuery + ' Where Sno Not in (Select TOP ' + Cast(@intTotal as Varchar(5))
Set @strQuery = @strQuery + ' Sno From Sales Order By Sno ) Order By Sno'

--Lets execute the dynamically built query
Exec sp_executesql @strQuery

Go

If I execute the stored procedure as Exec dbo.testPaging_SalesTable 5, 0 it means "give me 5 records for page 1". Similarly, if I pass the parameters as 5, 1, it means "give me 5 records for page 2" (the page number is zero-based). Hope it's clear up to this point!
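
Before moving on, the temporary-table alternative mentioned earlier would look roughly like this. This is a minimal sketch rather than a tuned implementation: it copies the keys into a temp table that has its own identity column and then joins back to Sales for the requested (zero-based) page.

Declare @intNumOfRecords int, @intPageNum int
Select @intNumOfRecords = 5, @intPageNum = 0

Create table #PagedKeys
(
RowNum int identity(1,1),
Sno int
)

-- The ORDER BY controls the order in which the identity values are assigned
Insert into #PagedKeys (Sno)
Select Sno from Sales order by Sno

Select S.*
From Sales S
Inner Join #PagedKeys K On S.Sno = K.Sno
Where K.RowNum Between (@intPageNum * @intNumOfRecords) + 1
And (@intPageNum + 1) * @intNumOfRecords
Order By K.RowNum

Drop table #PagedKeys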

Normally, dynamic queries are something one would prefer to avoid for performance reasons. A detailed discussion of that topic is out of scope here, so let me move on and show you the way to implement paging in SQL Server 2005.

Paging explained in SQL Server 2005

In SQL Server 2005, a new function named Row_Number() has been introduced. This function returns the running (sequential) number of each row within the specified ordering.

Method 1:

With the help of the Row_Number() function we can get a running number for every row, and by using the BETWEEN clause we can filter the records as per our need.

With SQLPaging As
(
Select Row_number() Over (Order by Sno) as RunningNumber, YearOfSales, SalesQuarter, Amount
From Sales
)
Select * from SQLPaging
Where RunningNumber between 2 and 3


Here in this method I have introduced you to a new concept called "Common Table Expression", or CTE for short.

A CTE is an expression that returns a temporary named result set from a query. Find below a very basic explanation of the above code snippet.

1. In the WITH clause we define the name of the CTE and, optionally, its columns. In our case the name of the CTE is SQLPaging.
2. Within the WITH clause we write the query which returns the expected result set.
3. After the WITH clause we write a SELECT statement that references the SQLPaging result set.

If you look at the above syntax you can see the similarity with views. The main difference is that views are permanent, since they are created as database objects, whereas a CTE is not created as a database object and exists only for the duration of the statement that uses it.
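
To make that comparison concrete, the same query written as a view would look like this (the view name is just an example I picked); unlike the CTE, the view stays in the database until it is explicitly dropped:

-- Persistent equivalent of the SQLPaging CTE
Create View dbo.vwSalesWithRunningNumber
As
Select Row_number() Over (Order by Sno) as RunningNumber, YearOfSales, SalesQuarter, Amount
From Sales
Go

Select * from dbo.vwSalesWithRunningNumber
Where RunningNumber between 2 and 3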

Method 2:

This method is an extension of the one above. I pass the page size and page number as inputs and then, with the help of the Row_Number() function, filter out the data pertaining to that particular page.

Create Proc dbo.uspFetchPagedData
(
@pPageSize as int,
@pPageNum int
)
As

/*
Stored Procedure: dbo.uspFetchPagedData
Creation Date: 06/22/2006
Written by: Vadivel Mohanakrishnan

Purpose : Paging using Row_Number and CTE in SQL Server 2005.

Testing :
Exec dbo.uspFetchPagedData 5,1
Exec dbo.uspFetchPagedData 5,2

*/

With SQLPaging2
As
(
Select Top(@pPageSize * @pPageNum) Row_number() Over (Order by Sno) as RunningNumber,
YearOfSales, SalesQuarter, Amount
From Sales
)
Select * from SQLPaging2 where RunningNumber > ((@pPageNum - 1) * @pPageSize)


Now that the stored procedure is complete, you can go ahead and test it yourself. The logic in the above code is pretty straightforward - no rocket science here :)

Conclusion

Hope I have got the point across. Let me recap what I have discussed in this article: I explained the way paging was done in SQL Server 2000 and the way we can do the same in SQL Server 2005. Along the way I introduced the enhancements made to the TOP keyword, as well as the new CTE and Row_Number() function in SQL Server 2005.

Happy programming!

Technorati tags: , ,

Saturday, September 30, 2006

History of SQL Server ...

I suggest all SQL developers have a look at http://SQLKnowledge.com. The site is rich with information on SQL Server and is really informative.

I read the history of SQL Server on that site and here is a summary of it:

1. 1993 - SQL Server 4.21 for Windows NT
2. 1995 - SQL Server 6.0, codenamed SQL95
3. 1996 - SQL Server 6.5, codenamed Hydra
4. 1999 - SQL Server 7.0, codenamed Sphinx
5. 1999 - SQL Server 7.0 OLAP, codenamed Plato
6. 2000 - SQL Server 2000 32-bit, codenamed Shiloh
7. 2003 - SQL Server 2000 64-bit, codenamed Liberty
8. 2005 - SQL Server 2005, codenamed Yukon
9. Next release - codenamed Katmai (not confirmed)

Technorati tags: , , ,