Saturday, December 31, 2005

Deprecation event category - Follow up

As today is a boring day (until tonight of course) I tested the Deprecation event in Profiler.

I know Microsoft has made an amazing effort to get error descriptions as clear as possible, and they succeeded quite nicely, but these messages are something else.

I tried a DBCC SHOWCONTIG which will be replaced and this is what I see in Profiler:

DBCC SHOWCONTIG will be removed in a future version of SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use it. Use sys.dm_db_index_physical_stats instead.
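On SQL Server 2005 the replacement the message points to can be queried directly. A minimal sketch (the table name is hypothetical):

```sql
-- Fragmentation info for all indexes on a hypothetical table;
-- 'DETAILED' reads every page, much like DBCC SHOWCONTIG did
SELECT object_id, index_id, avg_fragmentation_in_percent, page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.myTest'), NULL, NULL, 'DETAILED')
```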


With sp_lock I get the following:



sp_lock will be removed in a future version of SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use it.

The use of more than two-part column names will be removed in a future version of SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use it.



How clear is that?

3..2..1..1

Because of a corrective measure 2005 will be one second longer... enjoy it :-)

Anyway, may I wish you a very happy 2006 where all your dreams come true.
May it also be a year of many SQL Server 2005 implementations in your environment ;-)

Thursday, December 29, 2005

Reporting Services Timeout through SOAP

We have a report that is quite large and causes timeouts for one of our offices, not surprisingly the one with the most transactions.

So I started looking for the timeout configuration values and I must say Reporting Services has quite a collection of timeout values :-)

We do a nightly generation of PDF files through the RS web service, and it was the call to the web service that timed out. The behavior wasn't influenced by any of the configuration values of Reporting Services. So I started digging deeper and came across the following solution.

Add base.Timeout = 3600000 (the Timeout property is expressed in milliseconds, so this allows up to an hour) to the constructor of the ReportingServiceProxy class.

Troubleshooting Performance Problems in SQL Server 2005

Another great whitepaper by Microsoft.
Find it here

Wednesday, December 28, 2005

Deprecation event category

Apart from the unbelievable blocking-process monitoring and the amazing deadlock detection in the new SQL Server 2005 Profiler, I stumbled upon another nifty feature.

It's the Deprecation event. This event is triggered on statements that will no longer be supported in future versions of SQL Server.

I think it's quite funky that Microsoft includes this in Profiler.
I haven't played around with it yet but as soon as I find some time I'm sure going to.

Tuesday, December 27, 2005

OPTION FAST vs CONVERT

I was playing around with the OPTION FAST statement for an update query when I saw a 'weird' behavior.

When using OPTION FAST, SQL Server passed the criterion as a fixed value, while without it, it would use a variable, resulting in a conversion.

Then I remembered the following article, which lists the situations in which auto-parameterization does not happen, one of them being:

A statement with query hints specified using the OPTION clause.

I was passing a uniqueidentifier between single quotes, so apparently auto-parameterization was storing it as a character variable in the cache.
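A sketch of the two cases, with hypothetical table, column, and GUID values; the point is that the OPTION clause alone is enough to prevent auto-parameterization, so the literal stays in the cached plan as a fixed value:

```sql
-- Eligible for auto-parameterization: the quoted GUID may end up
-- as a character variable in the cached plan
UPDATE dbo.myOrders SET myStatus = 1
WHERE myGuid = '6F9619FF-8B86-D011-B42D-00C04FC964FF'

-- Not auto-parameterized because of the OPTION clause:
-- the GUID stays in the plan as a fixed value
UPDATE dbo.myOrders SET myStatus = 1
WHERE myGuid = '6F9619FF-8B86-D011-B42D-00C04FC964FF'
OPTION (FAST 1)
```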

Another mystery solved :-)

Playing around with DBREINDEX/CREATE INDEX

There are many ways to rebuild your indexes in SQL Server 2000 and I felt like doing some performance testing on the different ways.

A DROP and CREATE is obviously the slowest, because you rebuild your nonclustered indexes twice this way: first SQL Server has to change the bookmarks to RIDs because the table becomes a heap instead of a clustered table, and then it has to change the bookmarks again to point to the clustering key.

CREATE WITH DROP_EXISTING and DBCC DBREINDEX rebuild the clustered index in one transaction, preventing the double rebuild of the nonclustered indexes. When you rebuild a unique clustered index, the nonclustered indexes aren't rebuilt at all. The same goes for CREATE WITH DROP_EXISTING if you recreate the index on the same key.

Keep in mind that creating a clustered index without the UNIQUE keyword allows non-unique keys to be inserted. When that happens, SQL Server adds a 4-byte 'uniqueifier' to the duplicate keys to guarantee uniqueness. Another side effect is that the nonclustered indexes are rebuilt, because the uniqueifiers are regenerated every time you rebuild the index, which changes the clustering key.

One of the 'benefits' of the DROP_EXISTING method vs DBCC DBREINDEX is the ability to pass the SORT_IN_TEMPDB option. This forces SQL Server to store the intermediate sort results used to build the index in tempdb. Although this may improve index rebuild performance, it does take more disk space.

SQL Server 2005 introduces the ALTER INDEX statement with a REBUILD option.

Here is a little test script (always recreate the table when testing a reindexing method).
There is no timing for ALTER INDEX because I was testing on SQL Server 2000, but it should be about the same as DBCC DBREINDEX since they are supposed to be equivalent.

SET NOCOUNT ON
IF OBJECT_ID('myTest') IS NOT NULL
DROP TABLE myTest
GO
CREATE TABLE myTest
(myID int,
myChar char(512)
)
GO
DECLARE @i int
SET @i = 0
WHILE @i < 1000000
BEGIN
INSERT INTO myTest (myID, myChar)
VALUES (@i, REPLICATE('Z', 512))
SET @i = @i + 1
END
GO
CREATE UNIQUE CLUSTERED INDEX ci1 ON myTest (myID)
GO
--Create a ridiculous amount of indexes :)
CREATE NONCLUSTERED INDEX ix1 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix2 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix3 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix4 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix5 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix6 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix7 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix8 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix9 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix10 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix11 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix12 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix13 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix14 ON myTest (myChar)
GO
CREATE NONCLUSTERED INDEX ix15 ON myTest (myChar)
GO

--Drop and create - 00:26:39
DECLARE @Start datetime
SET @Start = Getdate()
DROP INDEX myTest.ci1
CREATE UNIQUE CLUSTERED INDEX ci1 ON myTest (myID)

PRINT CONVERT(varchar, GetDate() - @Start, 108)
GO
--Create with DROP_EXISTING - 00:01:23
DECLARE @Start datetime
SET @Start = Getdate()
CREATE UNIQUE CLUSTERED INDEX ci1 ON myTest (myID) WITH DROP_EXISTING

PRINT CONVERT(varchar, GetDate() - @Start, 108)
GO
--DBREINDEX - 00:01:16
DECLARE @Start datetime
SET @Start = Getdate()
DBCC DBREINDEX(myTest, ci1, 0)
PRINT CONVERT(varchar, GetDate() - @Start, 108)
GO
--ALTER INDEX
DECLARE @Start datetime
SET @Start = Getdate()
ALTER INDEX ci1 ON myTest
REBUILD;
PRINT CONVERT(varchar,GetDate() - @Start , 108)
--Clean up
IF OBJECT_ID('myTest') IS NOT NULL
DROP TABLE myTest
GO

Thursday, December 22, 2005

Connectivity problems with SQL Server 2005?

Check out this great series of posts

DELETE WITH TABLOCKX - Follow up

After some playing around with DBCC PAGE and DBCC EXTENTINFO, I came to the conclusion that it is indeed the deallocation that causes the process to take page locks (and extent locks when a complete extent is released).

From Inside SQL Server 2000

When the last row is deleted from a data page, the entire page is deallocated. (If the page is the only one remaining in the table, it isn't deallocated. A table always contains at least one page, even if it's empty.) This also results in the deletion of the row in the index page that pointed to the old data page. Index pages are deallocated if an index row is deleted (which, again, might occur as part of a delete/insert update strategy), leaving only one entry in the index page. That entry is moved to its neighboring page, and then the empty page is deallocated.

So I suppose the spid that does the deletes also deallocates the pages and extents, rather than the asynchronous process that scans for ghosted records.

Wednesday, December 21, 2005

SQL Server 2005 system tables diagram

I read on SQL Server Code about this great system tables diagram.

Download the system tables diagram for SQL Server 2005

DELETE WITH TABLOCKX

Recently I was playing around with one of the procedures we use for our reporting solution. One of the requirements was the ability to regenerate data for a certain date. To guarantee this, we delete the records for a specific day before filling the denormalized tables with data. Because some reports have a lot of records for one day, and we only fill them during the night when there is no activity, I thought it would be a good idea to use WITH (TABLOCKX): the finer the lock granularity, the more work SQL Server has managing the locks.

I accidentally came across some strange behavior (imho) when checking the actual locks taken by the process. It seems SQL Server switches to row and page locks, and escalates to table locks at some point, even though I hardcoded WITH (TABLOCKX) in the DELETE statement.

Here you can find a little reproduction script:

The funny thing is that the switch to page and row locks only happens as of a certain number of records (possibly because of the time it takes, so some other process finishes?)

SET NOCOUNT ON

IF OBJECT_ID('tblTablockTest') IS NOT NULL
DROP TABLE tblTablockTest
GO

CREATE TABLE tblTablockTest
(ID int,
myDate smalldatetime,
myText varchar(1000)
)

GO

DECLARE @i int

SET @i = 1

WHILE @i < 1025
BEGIN
INSERT INTO tblTablockTest
VALUES (@i, getdate(), REPLICATE(CAST(@i as varchar), 1000 / LEN(@i)))

SET @i = @i + 1
END

GO

CREATE CLUSTERED INDEX ixID ON tblTablockTest (ID)
GO


DBCC TRACEON (3604, 1200) --1200 shows detailed information about locking
BEGIN TRAN

DELETE FROM tblTablockTest WITH (TABLOCKX)
WHERE ID BETWEEN 1 AND 15 -- increase the 15 if the behaviour is not showing

ROLLBACK
DBCC TRACEOFF (3604, 1200)


I haven't figured out why this is happening yet but I'm on a mission ;-) One of my guesses is that it has something to do with page deallocation but I haven't got a good explanation yet.

Tuesday, December 20, 2005

SQL Trivia

Got this nice little pop quiz at work (which you are obviously bound to answer wrong).

Given the next SQL Script:



What would the output of the SELECT be?
Now my first reaction was obviously 3, 2, 1, and this would be correct on SQL Server 2005, but unfortunately SQL Server 2000 gives you 1, 2, 3.

Quite frankly I prefer the SQL Server 2005 method anyway ;-)

I lost :-(

Overwrite PDF with Reporting Services through SOAP

I had this really strange issue with one of my reports in development. Obviously developing a report is not something you do in one run. Because of this nice bug in Reporting Services I am only able to check the output by generating the PDF through the Reporting Services webservice.

I was getting this funny behavior where the PDF files were correct in size and even in page count, but all the pages were empty. When you eventually scrolled through the PDF it would crash, saying that the document was invalid.

Apparently the problem is caused by the way the .NET application overwrites the PDF. Deleting the file before creating it seems to solve the issue.

*EDIT*
Aha, found the nasty bugger :-(

FileStream fs = new FileStream(path, FileMode.OpenOrCreate);
instead of
FileStream fs = new FileStream(path, FileMode.Create);

FileMode.OpenOrCreate doesn't truncate an existing file, so when the new PDF is smaller than the old one, the leftover bytes at the end corrupt the document. FileMode.Create truncates first.

Developers *sigh* :-D

Monday, December 19, 2005

SQL Server Management Studio - Faster Start

Check out this post for more information about the command line parameters for SSMS.

Adding the nosplash parameter really speeds up starting Management Studio. Just add -nosplash to your SSMS shortcut.

How's that for an easy gain!

Thursday, December 15, 2005

MSDN Product Feedback Center

As you can see in a post from yesterday there is a nice website that enables you to send suggestions to Microsoft.

Check it out here

Wednesday, December 14, 2005

Those darn statistics

Recently I had this funny query which ignored an index although it seemed very useful to me. Normally I don't question the query optimizer too much, but this time I was really sure.

I tried an UPDATE STATISTICS WITH FULLSCAN on the index and all of a sudden my query started to use the index.

Statistics on indexes are always created with a full scan, because SQL Server has to scan all of the data anyway. This applies both to newly created indexes and to reindexing. As far as I can see, an automatic statistics update is always performed by sampling the index or table.
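For reference, a sketch of the statements involved (table and index names are hypothetical):

```sql
-- Rebuild the statistics for one index with a full scan instead of sampling
UPDATE STATISTICS dbo.myTable myIndex WITH FULLSCAN

-- When were the statistics on this index last updated?
-- (index_id 1 = the clustered index)
SELECT STATS_DATE(OBJECT_ID('dbo.myTable'), 1)
```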

Time for some more testing tomorrow! :-)

Product Suggestions

I got around to submitting a product suggestion to Microsoft (for SQL Server 2005).

I loved the shortcut to 'Manage Indexes' from the query plan, which let you create temporary indexes for performance testing. It was a very handy feature that made performance tuning a bit easier. I've submitted a suggestion here (go vote!)

I'm also following 2 other suggestions. One being the complete 'Script As' script with DROP and CREATE statements, check it out here.

Finally there is the very unhandy 'feature' of not being able to load scripts in the current connection and database context, check it out here.

It seems I'm a bit conservative and want isqlw features back :-)

Thursday, December 08, 2005

SQL Server 2005 BOL Update

Wow, Microsoft already released an update of the samples and books online for SQL Server 2005. That's very nice, keep up the good work!

They are already working on the next version btw :)

SQL Server 2005 BOL
SQL Server 2005 Samples

Sunday, December 04, 2005

An unexpected error occurred in Report Processing. (rsUnexpectedError) - Follow up

In this post I had a little problem with Reporting Services.
I didn't have time to investigate it a little deeper until a couple of weeks ago. The problem is that there is no real solution for the time being. Even SQL Server 2005 Reporting Services experiences the same issue.

So I finally decided to submit it to Microsoft Support. They have confirmed that it is a bug in Reporting Services and the developers are looking into it. Let's see how they fix it ;-) I'll keep you posted of course.

Wednesday, November 30, 2005

New System Objects in SQL Server 2005

One of the first and hardest things to achieve when a new version comes out is getting rid of old habits. I'm sure you all used sysobjects, syscolumns, and many others quite often. With SQL Server 2005, however, there are some changes: many of these tables have replacement views with extra information, although the old tables are kept for backward compatibility.

Check the BOL for "Mapping SQL Server 2000 System Tables to SQL Server 2005 System Views"
ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/tsqlref9/html/a616fce9-b4c1-49da-87a7-9d6f74911d8f.htm

This contains a nice overview of the old tables and their new replacement views. Also remember the ANSI-standard INFORMATION_SCHEMA views, which of course still exist in SQL Server 2005.
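A small before/after sketch of the habit change:

```sql
-- Old habit: still works in 2005, kept for backward compatibility
SELECT name FROM sysobjects WHERE xtype = 'U'

-- New habit: the replacement catalog view, with extra columns like create_date
SELECT name, create_date FROM sys.tables

-- ANSI-standard alternative, available in both versions
SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'
```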

Monday, November 28, 2005

Clock Hands

A very interesting post from our favorite SQL Memory Management Guru.

Oh yes... my first day @ work :-)

Monday, November 21, 2005

Layla

It's been a long time but there is a good reason ;)

My wonderful babygirl Layla was born on 14/11/2005
Weighing 3,110 kg and measuring 48,5cm.

Sleepless nights here they come :-s

Thursday, November 10, 2005

Microsoft SQL Server Management Studio

Like most people I tend to stick to what I know (especially when it comes to tools). I started with SQL Server 6.5, and back in those days isqlw (later Query Analyzer) was really your friend, because anything other than creating a table was not exactly a joyful process. With this background I am a big supporter of Query Analyzer, which after all these years I still launch via Start - Run - isqlw :-D

When Microsoft announced that Enterprise Manager and Query Analyzer would be integrated into one big Management Studio I wasn't too happy. Obviously feelings are irrelevant in IT, and I felt I had to face reality one day or another, so I decided to install Management Studio and start using it from now on.

After some time you notice that you get used to everything. I have to admit that my unhappy feelings are fading away and I'm beginning to see the 'cool' features of Management Studio. One big improvement is the ability to script all your modifications to an object directly to a query window instead of the clipboard. Some features are simple but very useful, like the ability to resize/maximize every window. When it comes to big databases, the tree is more intelligent in that it only loads the parts you select. Having to load all tables, stored procedures, and so on from all databases when you are perhaps only interested in one table in one database doesn't make any sense, right?

One of the things I immediately changed was the keyboard shortcuts. I'm quite addicted to sp_help and sp_helptext, and being able to select an object and get all its information with a simple shortcut has saved me many hours.

Wednesday, November 09, 2005

Tuesday, November 08, 2005

Reporting Services 2005 for clientside reports

Aha, I've been given the assignment to investigate the feasibility of Reporting Services 2005 for our online reports in the client application.

This however requires some specific features:
- multiple languages
- take xml as input (using xpath in textboxes)
- import of ActiveReports files
- ...

Does anyone have any experience with this yet?

Thursday, November 03, 2005

SQL Server 2005 Adoption Rates

Check out this report on migrating to SQL Server 2005 by Edgewood Solutions.

Monday, October 31, 2005

Uninstall SQL Server 2005 CTP

With the release of the RTM version you probably all want to remove the CTP version ;-)
There is a handy tool to help you with this process called sqlbuw.exe which can be found in the CTP install directory under Setup Tools\Build Uninstall Wizard\sqlbuw.exe

Friday, October 28, 2005

Visual Studio 2005/SQL Server 2005

Oh yes... blog post 1000 about the release probably :-)
But just to make sure... it has arrived!!!

Congratulations to the team and let it come to me now :-D

Thursday, October 27, 2005

MCITP: Database Developer

The Microsoft certifications have been reorganized, and there is now a distinction between a database administrator and a database developer.

A very welcome split.
The customer where I'm currently located uses this philosophy too. We have about 7 Development DBA's (including me) and 4 Operational DBA's.

Although most software developers are capable of writing queries, it amazes me how little most of them *really* know about SQL Server. On the other hand, a database developer's point of view is also very different from that of a database administrator.

I'd like to hear if there are any people out there opposing this split.

Anyway, check out the new certification here

Tuesday, October 25, 2005

PDC05

For those who, like me, were not lucky enough to attend PDC in LA:
Get the sessions here

WAITFOR to keep your UMS thread

I just love Ken's explanations :-)
Here

Monday, October 24, 2005

Wednesday, October 19, 2005

Reporting Services Linked Report Generator

Wow this is one cool tool!
But remember... a fool with a tool is still a fool :-)

Tuesday, October 18, 2005

Does it matter if I build the clustered index before/after rebuilding the non-clustered indexes?

I was reading a Q&A from one of Kimberly's great sessions when I came across this interesting question and an even more interesting answer. I had always been told that a rebuild of your clustered index causes a rebuild of your nonclustered indexes; apparently that statement needs to be refined.


Does it matter if I build the clustered index before/after rebuilding the non-clustered indexes?

You should always create the clustered index before creating non-clustered indexes, but as for rebuilding - you can rebuild your indexes independently, as a rebuild of the clustered index does not (generally) cause a rebuild of the non-clustered. There is one exception to this rule in SQL Server 2000 - in 2000 ONLY, SQL Server will automatically rebuild the non-clustered indexes when a non-unique clustered index is rebuilt. Why? Because the uniqueifiers are rebuilt.


Thursday, October 13, 2005

NOLOCK vs Clustered Index Order Part IV

Having a heap doesn't influence the order.
Both queries return the data in the same random order.

I've asked the question to a lot of people already, including MVPs and Microsoft employees... an answer will come :-)

Unfortunately nobody has an immediate answer :-(

I'll keep you posted!

Wednesday, October 12, 2005

Ken Henderson's Guru's Guide to SQL Server Architecture

Please check Ken's post.
Let him know how you feel about including OS information in his new book!

Monday, October 10, 2005

NOLOCK vs Clustered Index Order Part III

Ok... even worse now :-p
Issuing a DBCC DBREINDEX solves the NOLOCK order problem but then again... not if you have multiple datafiles in your database.

Oh boy :-)

NOLOCK vs Clustered Index Order Part II

Still haven't figured out why exactly it's happening but here's a script for the people that want to try.

CREATE TABLE tblClustered
(ID int,
MyDate smalldatetime,
TestField varchar(50))
GO
CREATE CLUSTERED INDEX ixID ON tblClustered (ID, MyDate)
GO
DECLARE @i int
SET @i = 0
WHILE @i < 1000
BEGIN
INSERT INTO tblClustered (ID, MyDate, TestField) VALUES (RAND() * 1000, CONVERT(varchar, getdate(), 112), REPLICATE('T', 50))
SET @i = @i + 1
END
SELECT * FROM tblClustered (NOLOCK)
SELECT * FROM tblClustered

DROP TABLE tblClustered

NOLOCK vs Clustered Index Order

I'm trying to understand something.

I have a simple table with a Clustered Index on an integer field.

When I use 'SELECT * FROM tblClustered' the records are nicely ordered by the clustered index key. When I use 'SELECT * FROM tblClustered (NOLOCK)', however, the records are returned in a semi-random order. By semi-random I mean that the order is always the same, but the rows are not sorted by the clustered index key. The query plan is the same, and I've tried DBCC DROPCLEANBUFFERS, CHECKPOINT, a restart of the SQL Server service, you name it...

Obviously you should never rely on the clustered key for the sort order but I'm just trying to figure out what's happening 'inside' that makes this difference.

More to follow!

Performance Fix for AMD Dual Core users

Something for freaks like me :-)
There seems to be a performance fix for AMD dual-core users with Windows XP SP2. It seems you have to contact Microsoft for the fix and it can't just be downloaded... weird :-s

Source

Business Intelligence

Hmm... now that SQL Server 2005 is getting closer I'd like to acquire some knowledge about Business Intelligence.

Let's start with this article

Wednesday, October 05, 2005

SQL Server 2005 Mission Critical High Availability

High availability with SQL Server 2005
Lots of interesting information here

SQLPass.NL

Check out our very own PASS for Dutch speaking people :-)
http://www.sqlpass.nl

Monday, October 03, 2005

Friday, September 30, 2005

Filter table on NULL value in Reporting Services

I was trying to find the correct filter expression for a NULL value in Reporting Services, but I kept failing.

So what's the trick?

Filter: =IsNothing(Fields!YourField.Value)
Operator: =
Value: =True

Notice the = before True as this really makes a difference!

Thursday, September 29, 2005

Query Optimizer vs Statistics

A nice Technet article on how the query optimizer uses statistics
Here

Monday, September 26, 2005

SQL Dependency Viewer

The people from RedGate keep amazing me with great tools.
Check out this tool... bye bye select * from sysdepends :-)

Sunday, September 25, 2005

Been a while

It's been a while but there is a very good reason for that :-(

There were some complications with the pregnancy of my girlfriend. The little one wanted to come out already, but my girlfriend is only 30 weeks pregnant, so it has to stay in a couple of weeks longer.

Anyway, a week and lots of medication later it seems to be under control... for the time being.

So as of tomorrow I should be back on track and posting interesting SQL stuff.

Friday, September 16, 2005

LINQ

A very weird innovation for a DBA :-)

LINQ

MSDN Webcasts

We all love them but can we find the time to listen to them?
Well Jeffrey Palermo found an interesting way... here

Database Mirroring out???

Hmmm I was reading this article by Paul Flessner when I came across this:

One lesson we learned is that we believe our Database Mirroring feature needs more time in the hands of customers before we make it generally available for production use. The feature is complete, has passed extensive internal testing, and we are committed to making it generally available in the first half of 2006. We will continue to field-test the feature with customers and will release it for general use as soon as you tell us it is ready.

Wednesday, September 14, 2005

September CTP

Hmmm PDC is busy... where's my September CTP of SQL Server?? :-)
Aaah here it is

Saturday, September 10, 2005

Another holiday...

Aaah another week @ home!
Although holiday is a big word... there is some heavy construction work going on :-p I have to prepare the room for my first little one, which is on the way!

Happy happy, joy joy :-D

Thursday, September 08, 2005

Microsoft hates DBAs

Funny article about SQL Server 2005 and the CLR integration :-)

Wednesday, September 07, 2005

bit or tinyint

I'm looking for some information about the following:
Is it better to use a tinyint field instead of a bit field if you are going to index it? Some people say it would probably be better to use a tinyint, because the bit has to be extracted from a bitmask, causing some overhead. I haven't found any really deep technical information about this, but it sounds plausible.

So if any of you have real hardcore stuff please post a comment :-D

In case I find some info I'll obviously send a little update!

Tuesday, September 06, 2005

Insufficient result space to convert uniqueidentifier value to char.

I needed a char representation of a GUID field but couldn't retrieve it because of an obscure error :-s

Insufficient result space to convert uniqueidentifier value to char.

This was caused by a CAST to varchar without specifying the length (or with too short a length). A CAST to char or varchar without an explicit length defaults to 30 characters, while the character form of a GUID needs 36.

Just using CAST(field as char(36)) solved the problem.
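A quick repro sketch:

```sql
DECLARE @g uniqueidentifier
SET @g = NEWID()

-- Works: the string form of a GUID is exactly 36 characters
SELECT CAST(@g as char(36))

-- Fails with "Insufficient result space to convert uniqueidentifier value to char.":
-- CAST to char/varchar without a length defaults to 30 characters
--SELECT CAST(@g as varchar)
```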

Understanding TempDB, table variables v. temp tables and Improving throughput for TempDB

Yet another great post of one of my favorite SQL Server gurus :)

Finding Duplicate Indexes in Large SQL Server Databases

A nice article on how to detect duplicate indexes.
I suppose it happens more than you think...

Monday, September 05, 2005

Create a PDF from a stored procedure without using 3rd party utils

Hmm, a very funky way to create a PDF in SQL Server :-)
Check it out.

I can't say that I'm a fan of such practices but who knows someone might be able to use it out there.

Thursday, September 01, 2005

Uninstalling CTP

Want to get rid of previous CTP installations easily?
Check out this post. The VS 2005 Diagnostic and Uninstall Tool does the sometimes difficult task for you!

Tuesday, August 30, 2005

Override isolation level

When setting the transaction isolation level, it can be useful to control locking at table level regardless of the level specified for the session. SQL Server supports this through table hints: a WITH clause in the FROM clause.

Eg.

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE

SELECT t1.Field FROM table1 t1
INNER JOIN table2 t2 WITH (READUNCOMMITTED) ON t1.ID = t2.ID

This would cause serializable locks on table1 and read uncommitted on table2.

Monday, August 29, 2005

String Summary Statistics

I found this little interesting information in this article

SQL Server 2005 includes patented technology for estimating the selectivity of LIKE conditions. It builds a statistical summary of substring frequency distribution for character columns (a string summary). This includes columns of type text, ntext, char, varchar, and nvarchar. Using the string summary, SQL Server can accurately estimate the selectivity of LIKE conditions where the pattern may have any number of wildcards in any combination. For example, SQL Server can estimate the selectivity of predicates of the following form:

Column LIKE 'string%'
Column LIKE '%string'
Column LIKE '%string%'
Column LIKE 'string'
Column LIKE 'str_ing'
Column LIKE 'str[abc]ing'
Column LIKE '%abc%xy'

If there is a user-specified escape character in a LIKE pattern (i.e., the pattern is of the form LIKE pattern ESCAPE escape_character), then SQL Server 2005 guesses selectivity. This is an improvement over SQL Server 2000, which uses a guess for selectivity when any wildcard other than a trailing wildcard % is used in the LIKE pattern, and has limited accuracy in its estimates in that case. The String Index field in the first row set returned by DBCC SHOW_STATISTICS includes the value YES if the statistics object also includes a string summary. The contents of the string summary are not shown. The string summary includes additional information beyond what is shown in the histogram. For strings longer than 80 characters, the first and last 40 characters are extracted from the string and concatenated prior to considering the string in the creation of the string summary. Hence, accurate frequency estimates for substrings that appear only in the ignored portion of a string are not available.
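You can check whether a statistics object has a string summary yourself; a sketch with hypothetical table and index names:

```sql
-- The first result set contains a String Index column:
-- YES means the statistics object also includes a string summary
DBCC SHOW_STATISTICS ('dbo.myTable', 'ix_myVarcharColumn')
```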

Sunday, August 28, 2005

MySQL

I've been playing with MySQL for a project of a friend of mine (http://www.yawn.be).
I honestly love SQL Server even more now :-) I don't want to talk badly about MySQL itself, because I don't know enough about the product to do so, but the bundled tools are really sad compared to the tools included with SQL Server. I just wanted to see the queries that were run and their IO, CPU, duration, etc. (<3 Profiler <3), but apparently there is no way to determine this with the standard tools. Graphical query plans, or even something like the text query plans in SQL Server... I couldn't find it :-(

Friday, August 26, 2005

MSDN users to get early access to Visual Studio 2005 tools

According to this article in Infoworld we will have access to a release candidate of SQL Server 2005 and Visual Studio 2005 in September! The final versions would be available mid to end October.

Very good news!

Thursday, August 25, 2005

Using Sp_configure To Change a Value Will Issue DBCC FREEPROCCACHE

Another little-known fact I guess, but Brian Moran pointed out that issuing sp_configure to change a value triggers DBCC FREEPROCCACHE, which removes all cached procedure plans.

SQL Server Magazine's Reader's Choice Awards

And the award goes to....... A First Look at SQL Server 2005 for Developers.

Congratulations to the authors!

Buy

Wednesday, August 24, 2005

Adding a non nullable column without a default

When trying to add a non-nullable column without a default to an existing table, you get an error message (even when there are no records in the table)



ALTER TABLE only allows columns to be added that can contain nulls or have a DEFAULT definition specified. Column 'NewColumn' cannot be added to table 'tbl_test' because it does not allow nulls and does not specify a DEFAULT definition.

There is, however, an easy workaround: add the column with a temporary default constraint and drop the constraint afterwards.

ALTER TABLE dbo.tbl_test
ADD NewColumn smalldatetime NOT NULL
CONSTRAINT DF_ToDrop DEFAULT ('2000-01-01')

ALTER TABLE dbo.tbl_test
DROP CONSTRAINT DF_ToDrop

Google Talk

Google released an IM GoogleTalk.
Looks like MSN could get a decent competitor

Ken Henderson

For those of you that don't have it already... buy the book!

Tuesday, August 23, 2005

sql_variant

sql_variant is one of those datatypes I don't often see being used, although it has its advantages. It's kind of like a varchar in which you can store numeric values, alphanumeric characters, dates, ... with the difference that on a sql_variant field you can determine which datatype each value is.

One tip I would give you is to always explicitly cast the value you insert (or update) to the datatype you want it to be. Apparently SQL Server uses 2 extra bytes to determine the extra information like the datatype etc. (eg. int would be 6 bytes).

The SQL_VARIANT_PROPERTY function gives you information about the variant itself like datatype, total bytes, precision ...

For more information visit:
sql_variant
SQL_VARIANT_PROPERTY
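A quick sketch of both tips, with a made-up table name (BaseType and TotalBytes are the properties I find most useful):

```sql
-- Hypothetical settings table to illustrate sql_variant
CREATE TABLE dbo.tbl_settings (SettingName varchar(50), SettingValue sql_variant)

-- Explicitly cast so the variant stores the base type you intend
INSERT dbo.tbl_settings VALUES ('MaxRetries', CAST(5 AS int))
INSERT dbo.tbl_settings VALUES ('StartDate', CAST('2005-01-01' AS smalldatetime))

-- Check what each variant actually holds
SELECT SettingName,
       SQL_VARIANT_PROPERTY(SettingValue, 'BaseType')   AS BaseType,
       SQL_VARIANT_PROPERTY(SettingValue, 'TotalBytes') AS TotalBytes
FROM dbo.tbl_settings
```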

An interesting Q&A with Kimberly L. Tripp

Read it here

Monday, August 22, 2005

Back and already having fun

Aaah back from my 2 week holiday. I can't say I'm thrilled to be back but I can live with it ;-)

Especially when I find things like this :-)

Here's a nice article about fragmentation

Thursday, August 11, 2005

Holiday

Hi,

I've got 2 weeks off so you might not read very much this week and the week to come :)
Take care

Thursday, August 04, 2005

Wednesday, August 03, 2005

This must be painful

Paul Thurrott wrote an article where he urges us to boycott IE because of its lack of support for standards. Now someone has written a nice reply to this...

Ouch ouch ouch

Let’s get into standards now. Guess what, Paul? Your site, WinSuperSite.com, currently has 124 validation errors, according to the W3C’s Markup Validation Service. Even worse, the page which contains your “Boycott IE” story currently has 207 validation errors. Both pages don’t even define the page’s doctype, which is almost always the first line of the web page.

Monday, August 01, 2005

Page splits

Page splits are not too good for performance and should be closely watched.
One way of determining whether you are having page splits and how many is to use the function ::fn_dblog (@StartingLSN, @EndingLSN). Passing NULL to the two parameters makes sure that the whole log is being read.

eg.

SELECT [Object Name], [Index Name], COUNT(*)
FROM ::fn_dblog(NULL, NULL)
WHERE Operation = 'LOP_DELETE_SPLIT'
GROUP BY [Object Name], [Index Name]
ORDER BY COUNT(*) DESC

You can reduce page splits by specifying a good fill factor. There is no real rule of thumb about fill factors. It all depends on the rowsize and number of rows inserted (between index rebuilds). So specifying a good fill factor and a DBCC DBREINDEX from time to time will minimize the number of page splits.
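To make the fill factor tip concrete, a sketch with made-up table and index names:

```sql
-- Leave 20% free space on the leaf pages to absorb inserts between rebuilds
CREATE INDEX IX_tbl_orders_OrderDate ON dbo.tbl_orders (OrderDate)
WITH FILLFACTOR = 80

-- Rebuild from time to time to restore the free space
DBCC DBREINDEX ('dbo.tbl_orders', 'IX_tbl_orders_OrderDate', 80)
```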

Friday, July 29, 2005

An unexpected error occurred in Report Processing. (rsUnexpectedError)

I'm creating one of the most important reports for the business and I'm having a weird problem. The report renders just fine in the browser but as soon as I try to export it to a PDF (which is basically what we do with all our reports - of course automatically through the webservice) it crashes with the above mentioned error.

After some surfing I came across an article in the Reporting Services newsgroup where one MattiasT had the same problem and he found the conditions for it to happen.

1. You are using a table.
2. The HideDuplicates property is used on a detail cell in this table.
3. The table contains a subreport.

And sure enough... when I remove the HideDuplicates the report generates just fine.

Unfortunately there seems to be no solution for the time being, except changing the report so that not all three conditions are met, since all three together trigger the problem.

Thursday, July 28, 2005

TempDB and missing Device

Just had a nice little incident :-s

One of our development servers was used for tests with a SAN disk. They moved our TempDB to the SAN and after the tests removed the SAN disks. No problem, but they forgot to move the TempDB location to the local disks. Result? No more SQL Server :-x

In order to get it back up we did the following:
Start SQL Server from the command prompt with the following syntax
sqlservr -f -c -T3609 -T4022

-f: minimal configuration mode
-c: not as a service (console mode so you can follow the nice little thingies SQL Server does)
-T3609: skip tempdb creation at startup
-T4022: bypasses automatically started procedures

Then we updated sysdatabases and sysaltfiles (I know it's hardcore) with the correct file locations.

Restart SQL Server and GO GO GO
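For the record: when the server can still start, the supported way to move TempDB is ALTER DATABASE rather than editing the system tables (the logical names below are the defaults, the paths are just examples):

```sql
-- New locations take effect after the next restart of SQL Server
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'D:\MSSQL\Data\tempdb.mdf')
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'D:\MSSQL\Data\templog.ldf')
```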

The Daily WTF (What The F*ck)

For those of you that don't know the site, this is really one you should visit every day ;-)
It's a website that gathers programming stupidity on a daily basis.

Tuesday, July 26, 2005

Monday, July 25, 2005

Mindi Abair

Heard some nice music yesterday. I was chilling with some music on SmoothJazz when I heard a song by Mindi Abair called Make A Wish, very nice indeed! When I was all chilled out I switched to Radio Paradise which is really a very good radio station.

Check them out.

Week @ Home

Aaah, just got back from a little holiday :-s
It's good to be back but I liked it more being home ;-)

Friday, July 15, 2005

CHAR and NULL

CHAR fields always take up the space assigned to them, even when they are NULL. You may think a NULL CHAR wouldn't occupy its full length, since NULL-ness is tracked by a bitmask in the row data (in SQL Server 2000 this bit is always present, even if your column is not nullable).

That was true in SQL Server 6.5 but not in SQL Server 2000. So if you really need a nullable CHAR field it may be better to define the field as VARCHAR.

Wednesday, July 13, 2005

Report Packs

If you are working with Reporting Services please don't forget to check here once in a while.
Microsoft provides some nice Report Packs from time to time.

Have fun

Monday, July 11, 2005

New computer - installed

Installed and man does it fly!
I am very pleased with the results although I had some trouble with XP64 and I had to install XP32 again :-(

Thursday, July 07, 2005

Kimberly L. Tripp on developing for SQL Server

Check it out here

A bold statement: "Open source could damage the market"

Bob Hayward, senior vice president and chief research officer, Gartner Research Asia-Pacific, said that software developers could be discouraged from creating new software because of the multitude of open source software available for free. This is further driven by major vendors that are making their software available as open source.

Check out the complete article here

Software Patents... No no no

The EU has voted against "Software Patents".
Of the 732 people 648 voted no, a clear vote imho.

I suppose this will not be the end of the whole story...

Wednesday, July 06, 2005

New computer - update

Yes yes yes :)
It has arrived... only the CPU was out of stock so I have to wait a little bit :'( If all goes well it should arrive tomorrow.

I also decided to buy another case :-)

Tuesday, July 05, 2005

SAN Phase II

We got to talk with the people responsible for the SAN.
Seems our plans are quite useless, the layout of the disks has changed :-( Anyway, we decided on a new plan. Currently +200 disks are assigned to our development servers. We're waiting for the draft as proposed by them (considering our remarks of course).

Friday, July 01, 2005

Those dreadful 'Software Patents'

Check out this post by Clemens Vasters

Tetris SAN

Hmmm... we're trying to determine the layout of our development servers databases and files on the new SAN. We have to place 7.5TB for 48 servers in 14GB blocks... it looks a lot like tetris :-D

Thursday, June 30, 2005

AWE 3GB PAE... WTF?

Lately we have been experimenting with the memory options in our tests of a new server. Sometimes it's not quite clear what all those options do, I'll try to give a short and clear description here.

/3GB

Windows normally allows user mode applications to use 2GB of memory. The switch moves the starting point of the kernel space from 2GB to 3GB, making the extra gigabyte available to user mode applications. Windows 2003 added an extra switch, /USERVA, to customize the exact amount between 2GB and 3GB. In short this makes it possible for your applications to use 3GB instead of 2GB of memory. Do note that there may be some drawbacks when your server is very powerful: in some cases the server handles so many threads that its 1GB of kernel memory is insufficient.

Note: Microsoft Product Support Services strongly recommends using a range of memory for the /USERVA switch that lies within the range of 2800-3030. This range is wide enough to provide a large enough pool of system page table entries for all currently observed issues. Usually a setting of /userva=2800 will provide close to the maximum available number of system page table entries possible.


PAE (Physical Address Extension)

PAE allows Windows to address more than 4GB of memory; SQL Server takes advantage of it in combination with AWE.

AWE

AWE is an API that enables programs to address more than 4GB of memory (with PAE). It lets them reserve memory as 'non-paged' memory. This means that memory taken by AWE is never paged to disk until the application is closed or explicitly frees it.
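On the SQL Server side AWE is enabled with sp_configure; /3GB and /PAE themselves go in boot.ini. A sketch (the max server memory value is just an example):

```sql
sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- Let SQL Server use AWE memory (the service account needs the
-- 'Lock Pages in Memory' right)
sp_configure 'awe enabled', 1
RECONFIGURE
GO
-- With AWE the memory is not released dynamically, so cap it explicitly
sp_configure 'max server memory', 6144
RECONFIGURE
GO
```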

Wednesday, June 29, 2005

8 Steps to better Transaction Log throughput

For those of you who don't read Kimberly Tripp's blog: check out this great post
And after that add her website to your favorites!

Tuesday, June 28, 2005

New Computer

It's finally time for a new computer. I've got this one for 3 years already and in the geek world that's a mighty long time :-p

This is what I had in mind:

MSI S939 K8N SLI Platinum nForce4UltraSLI 4DCDDR4 2PCI-X Sata 7.1ch FW GLan

AMD Athlon 4200+ 64Bit X2 Dual-Core Processor Socket 939 Boxed with cooler

GEIL Ultra-X Series Dual Channel 1GB 2x512MB PC3200 400MHz Selected 5ns CL 2 5-2-2 GLX1GB3200DC

Western Digital Raptor 74Gb E-Sata 10000rpm 8Mb cache 5,2Mms

Maxtor 300Gb 7200rpm SATA 150 16Mb DiamondMax 10

Gainward PCI-E GT6600GT PowerPack! Ultra/1980PCX TV-DVI 256Mb 2ns DDR3 Fan

Sharkoon Silvation Case - Airflow by included Airtunnel - Alu front panel Screwless - w/o PS

Power Supply 400Watt ATX JJ-400PPGA Silent Intel 2 20/24 pin 8cm SATA - PCI-E Retail + power cable

NEC DVD +R/+-RW ND-3540 Black OEM Dual Layer

DVD LG 8161B 48x16x Tray Blk Black (black or white + black front)

Floppy disk drive Sony 1,44MB Black

A lot of the parts are not available right now so I'll have to wait a while until they are delivered. Finally I will be able to install Windows XP 64 bit, and then probably notice that half my programs no longer work :-s

Any comments on what to buy or not to buy are of course welcome!

Monday, June 27, 2005

Microsoft Update Services

Microsoft has released its new version of Windows Update.
Not only will it update your Windows but also your Exchange Server, SQL Server, Office, ...

Please check for details as not all versions of the above mentioned products are supported.

Check it out here

BINARY_CHECKSUM AND CHECKSUM

In all the years I've been working with SQL Server I have never seen anyone use these functions.

Their purpose is to build hash indexes, speeding up queries by matching a 4-byte int instead of performing a costly character comparison. Especially on large text columns these functions can be of use.

For more information check out the BOL.

I'll take the example from BOL (c) Microsoft

-- Create a checksum index.
SET ARITHABORT ON
USE Northwind
GO
ALTER TABLE Products
ADD cs_Pname AS CHECKSUM(ProductName)
CREATE INDEX Pname_index ON Products (cs_Pname)

/* Use the index in a SELECT query. Add a second search condition to
catch stray cases where checksums match, but the values are not identical. */
SELECT *
FROM Products
WHERE CHECKSUM(N'Vegie-spread') = cs_Pname
AND ProductName = N'Vegie-spread'

There's also a little brother called CHECKSUM_AGG which returns the checksum of the values in a group. The example in BOL uses it to track changes in data.

SQL Server 2005 System Tables and Views

A nice article about this topic can be found here

Daddy

Since my father is advertising my weblog I feel I should return the favor.
Please visit his website here, although he's from another generation you should really check it out. My parents are quite modern and the website really shows this :-)

Friday, June 24, 2005

Microsoft Anti Spyware

A new beta build has been released
Check out this link for more information

Tuesday, June 21, 2005

MSDN TV - Reporting Services 2005

A nice MSDN TV movie with tips and tricks for Reporting Services in SQL Server 2005:
Download

TempDB guidelines

Wow, I found a really good article on TempDB in SQL Server 2005.
Check it out here and a little update here

SQL Server 2005 Resources

An excellent list of SQL Server 2005 resources can be found here

Assessments for SQL Server 2005

There are some assessments available on SQL Server 2005.
I scored 70% on the SQL Server 2005 for Developers test, not too bad for a first time I hope :-)

In case you wonder, I'm not bored, I'm just testing some processes that take quite some time ;-)

Time for some marketing material :-)

SQL Server outperforms Oracle on SAP (4 CPU platform) here

And a security comparison here

Microsoft Windows Server 2003 Performance Advisor

Check out this cool program from Microsoft!
It's a diagnostics program that collects performance information. Once the program has collected the data you get a nice overview of all the information captured. It is a highly configurable program and not for the faint of heart. It supports trending if you have a SQL Server 2000 database available.

Monday, June 20, 2005

SQL Server 2005 - June CTP Installed

Aha finally gotten around to installing SQL Server 2005 June CTP.
I wanna play but it's too late :( I need some rest if I want to make it through the week :)

AWE vs SQL Server 2000 SP4 - Fix available

Aha!
The fix for the AWE/SP4 problem has arrived!

Friday, June 17, 2005

Creating covering indexes in SQL Server 2005

CREATE INDEX INCLUDE statement (SQL Server 2005)

I've been testing the new INCLUDE statement in 2005.

From the BOL:


INCLUDE ( column [ ,... n ] )

Specifies the nonkey columns to be added to the leaf level of the nonclustered index. The nonclustered index can be unique or nonunique.

The maximum number of included nonkey columns is 1,023 columns; the minimum number is 1 column.

Column names cannot be repeated in the INCLUDE list and cannot be used simultaneously as both key and nonkey columns. For more information, see Index with Included Columns.

All data types are allowed except text, ntext, and image. The index must be created or rebuilt offline (ONLINE = OFF) if any one of the specified nonkey columns are varchar(max), nvarchar(max), or varbinary(max) data types.

Computed columns that are deterministic and either precise or imprecise can be included columns. Computed columns derived from image, ntext, text, varchar(max), nvarchar(max), varbinary(max), and xml data types can be included nonkey columns as long as the computed column data type is allowable as an included column. For more information, see Creating Indexes on Computed Columns.

Didn't have much time but I did a couple of quick tests. I just created a simple table with a covered index 'the old way' and then 'the new way'. From what I've seen so far the duration is not much lower, but inserting 10000 records into the table takes about 10000 fewer reads with the INCLUDE index. From the info above I suppose it has something to do with the fact that the nonkey column is only present at the leaf level and not throughout the whole tree like a key column.
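'The old way' versus 'the new way' in my test looked roughly like this (table and column names invented for the example):

```sql
-- Old way: every covered column is a key column and lives at all levels of the tree
CREATE INDEX IX_orders_covered_old ON dbo.tbl_orders (OrderDate, CustomerID, Amount)

-- New way: Amount is carried at the leaf level only
CREATE INDEX IX_orders_covered_new ON dbo.tbl_orders (OrderDate, CustomerID)
INCLUDE (Amount)
```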

As soon as I find some time to test it a little deeper I'll try to let you know!

SQL Server 2005 - June CTP

Oh yeah, I forgot to mention it :-s
As you probably already know... SQL Server June 2005 Community Technology Preview is available!

Go go go!

AWE vs SQL Server 2000 SP4

There appears to be a fix for the AWE problem in SP4 of SQL Server 2000.
The bad news is that it's not available to the public yet, although it should be obtainable through PSS.

Report Builder available in Standard Edition of SQL Server 2005 Reporting Services

Some good news from Microsoft, and I quote Paul Flessner:

In February, we announced the product line for SQL Server 2005. As part of this announcement, we revealed that Reporting Services would ship with Standard and Enterprise Editions and that Report Builder would be available with Enterprise Edition. Based on your feedback, and because we believe that reporting is a core scenario for all customers, we've made the decision to include Reporting Services in all versions of SQL Server 2005 including Express, Workgroup, Standard and Enterprise Editions. In addition, Report Builder will be available with Workgroup, Standard and Enterprise Editions.

I think it's great that Microsoft listens to their users... keep up the good work!

Thursday, June 16, 2005

How to avoid "Parameter Sniffing" in SQL Server 2005

Good news for people with 'parameter sniffing' problems.
Although Ken Henderson gave us a nice solution some time ago for SQL Server 2000, SQL Server 2005 will support the OPTIMIZE FOR statement. Adding this to a query makes sure the plan is optimized for the value you specify. And when all else fails you can even force a particular query plan with the new USE PLAN statement. I'll try to give some examples as soon as I find some time.
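Until I get to proper examples, this is roughly what the hint looks like inside a procedure (procedure, table, and value are invented):

```sql
CREATE PROCEDURE dbo.usp_GetCustomers @CountryCode char(2)
AS
-- Compile the plan as if 'BE' was passed, whatever the actual parameter value is
SELECT *
FROM dbo.tbl_customers
WHERE CountryCode = @CountryCode
OPTION (OPTIMIZE FOR (@CountryCode = 'BE'))
```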

Tuesday, June 07, 2005

Visual Studio 2005 and SQL Server 2005 release date announced

Just as announced a couple of months ago (although not officially), Microsoft Sets Visual Studio, SQL Server, BizTalk Server Launch Date in November.

More information here

As part of the keynote address, Paul Flessner, senior vice president of Server Applications at Microsoft, showed the company’s continued momentum in preparation for the launch of SQL Server™ 2005, Visual Studio® 2005 and BizTalk® Server 2006, and announced that these products will be formally launched during the week of Nov. 7.

Reporting Services still doing a great job

Our Reporting Services solution is still running fine :-D

This gives me the chance to leave for Spain in a happy mood. We (our company) are going on a trip to Spain from Wednesday (yep that's right, tomorrow :p) until Tuesday next week. While we are there there will be 2 courses. One will be about SQL Server 2005 and the other one about Visual Studio 2005 and obviously .NET 2.0. Two great topics imho although many people will disagree :)

The weather is looking quite nice for the time being ;)

Friday, June 03, 2005

Reporting Services doing a great job

In other news, our Reporting Services solution is delivering 20-25 reports per second (easily beating the 8 per second which was our goal). The reports are available around 01:10am whereas the target is 07:30am, I guess we can generate some more :-D

A couple of minor data issues the first 2 days in production but everything went quite well.

/me is happy

How to avoid "Parameter Sniffing" in SQL Server 2000

Here's a great article about parameter sniffing.
Funny he's using Belgium as an example, it's where I live :)

Monday, May 30, 2005

Tomorrow will be the big day.
The first test run in production today was a bit disappointing :( Instead of 32 we were only getting 17 reports per second. Although that is still twice the amount we need I would have hoped that the performance would be better. The problem is that there are other processes running (and quite heavy ones) while the reports are being generated. Because the new SAN has to be installed we are running all this on local disks atm.

Friday, May 27, 2005

I was browsing http://www.sql-server-performance.com when all of a sudden I saw this article. It's not the article itself but who wrote it that makes it interesting. It's an ex-colleague who apparently found some time to write up this nice article. Nice going Geert ;)

Thursday, May 26, 2005

This Sunday will be the release of our Reporting Services solution. I'm getting kind of nervous as they haven't installed our production servers yet and they should be ready by the end of the week. There is a lot of pressure on the solution as the current solution delivers the reports with unacceptable delays. Another problem is that the people from the union are very displeased with this. Generating the reports on time and with the correct information is of major importance.

I'll let you know whether I'm happy or sad come Tuesday :-)

Friday, May 20, 2005

I saw this cool product called SSW SQL Audit.
As soon as I find some time I'll test it. I'll keep you updated.

http://www.ssw.com.au/SSW/Products/ProdSummaryList.aspx
UPDLOCK

The UPDLOCK locking hint is a very interesting one.
It prevents other connections from updating a record but still allows them to read the information. It assures you that your process will be the next in line for an exclusive lock. SQL Server actually uses update locks itself before acquiring an exclusive lock (eg. an UPDATE with a WHERE clause). This usually happens so fast that the update locks probably won't show up in the locking information you get to see.

To prevent conversion deadlocks (typical in a read and update transaction) it is a good idea to use this locking hint (you'll probably want a REPEATABLE READ or SERIALIZABLE isolation level for these kind of transactions too).
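A sketch of such a read-then-update transaction (table and columns are made up):

```sql
DECLARE @Balance money

BEGIN TRAN
    -- Update lock up front: readers can still read,
    -- but nobody else can grab the exclusive lock before us
    SELECT @Balance = Balance
    FROM dbo.tbl_accounts WITH (UPDLOCK)
    WHERE AccountID = 42

    UPDATE dbo.tbl_accounts
    SET Balance = @Balance - 100
    WHERE AccountID = 42
COMMIT TRAN
```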

Thursday, May 19, 2005

Finally found some time to stress test our Reporting Services solution on the servers that will soon run in production.

With 1 DB server and 2 App servers we are generating 32 reports/sec. Quite a nice result as our target was 8/sec :-)

If you want more information or details feel free to contact me.

Wednesday, May 18, 2005

Reporting Services weirdness :s

In order to get a text from a database to display in the page footer you have to include a textbox in the data region and then reference this textbox in a second textbox which you create in the footer region.

No problem so far, a bit weird but not too much trouble. However, when exporting to PDF the footer would just display on the first page and not on all the following pages.

I have solved it by defining an extra parameter, clearing the prompt text and setting the default value to From Query. Select the correct Dataset and the field you would like to use.

Works like a charm.

There was one little issue left though. The report would fail if the stored procedure that retrieved the parameter value would not return any records. I have solved this with the following simple yet effective workaround:

DECLARE @Description varchar(1024)

SELECT @Description = [Description] FROM tbl_document_caption WHERE ...

SELECT @Description as [Description]
Check out this website.
It was created by a friend of mine and I promised I would promote it :-D

http://www.lollersaurus.com

Tuesday, May 17, 2005

It appears that there is an issue in SP4 for SQL Server 2000 with AWE.
From the Microsoft website:

Warning: Microsoft has found an issue with the final build of SP4 that impacts customers who run SQL Server with Address Windowing Extensions (AWE) support enabled. This issue only impacts computers with more than two gigabytes (2 GB) of memory where AWE has been explicitly enabled. If you have this configuration, you should not install SP4. Microsoft is currently working on the problem and will issue an update soon.

Wednesday, May 11, 2005

64bit in the everyday environment.
By the looks of this column it doesn't look too good (for the time being though).
Here you can find an interview with Simon Galbraith, co-founder of Red Gate.

"There is no one on the planet who can tell you with a greater degree of certainty that software marketing isn’t rocket science ?! "

Red Gate has some amazing tools for SQL Server; you should check them out.

Monday, May 09, 2005

Here's another interesting post about Memory Management in Windows.

https://blogs.msdn.com/slavao/archive/2005/01/29/363181.aspx
Service Pack 4 for SQL Server 2000 has been released!

http://www.microsoft.com/sql/downloads/2000/sp4.asp

In case you guys don't know her already.
She is one of the absolute gurus in the obscure SQL Server world :-)

http://www.sqlskills.com/blogs/kimberly/

Her blog isn't updated on a daily basis but when she updates it you can be sure that it's very useful information.

Here's an interesting article on JOIN hints.

Transact-SQL Blog: Optimizer Join Hints

Wednesday, May 04, 2005

I was trying to determine whether I had put all of my tables on the correct filegroup and I wanted to list this. I came up with the following query to quickly list all the needed information.

SELECT o.Name as 'Table Name', g.GroupName
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id
JOIN sysfilegroups g ON g.groupid = i.groupid
WHERE OBJECTPROPERTY(o.id, 'IsUserTable') = 1
AND i.indid < 2 -- the heap or clustered index determines where the table data lives

A lot of people tend to use DBCC FREEPROCCACHE when they are performance testing procedures. This is obviously a very good habit; however, when sharing a server with other databases you are clearing the cache for all databases. You can prevent this by using DBCC FLUSHPROCINDB(dbid), which only clears the cache for a specific database.
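For example (database name invented; look the id up first, since DBCC wants a literal):

```sql
-- Find the database id
SELECT DB_ID('MyDatabase')

-- Flush only that database's plans (assuming the id came back as 7)
DBCC FLUSHPROCINDB(7)
```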

Tuesday, May 03, 2005

Found this old but yet very useful article in SQL Server Magazine (InstantDoc 40925).
It is an in depth study of DBCC SQLPERF (WAITSTATS) with a lot of information on what to monitor.

There are a couple of interesting formulas in the article that came to my attention.

Initial Compiles(%) = (SQL Compilations/sec - SQL Recompilations/sec) / Batch Requests/sec
Plan Reuse (%) = (Batch Requests/sec - Initial Compiles/sec) / Batch Requests/sec

To determine the optimal Packet Size use the following formula:
Average Number of Bytes per packet = Bytes Total/sec / Packets/sec

Wednesday, April 27, 2005

In our journey for new questions to ask on an interview I came up with an interesting question. Although it is not too difficult there are not too many people who really know the difference between DELETE (without WHERE) and TRUNCATE.

"TRUNCATE is faster than DELETE" is not a complete answer (although true); the real question is what makes it faster.

DELETE removes the rows one at a time and records this in the transaction log. TRUNCATE removes them by deallocating the data and index pages and only this action is logged.

There are a couple of things to note though:
- If a foreign key references the table TRUNCATE will not work
- If the table is publishing data for replication TRUNCATE will not work
- Delete triggers are not fired

Many people believe that TRUNCATE is not logged, this is of course not true. You can use TRUNCATE in a transaction (try using it in a transaction and rollback that transaction).

Wednesday, April 20, 2005

New beta's have arrived for both Visual Studio 2005 and SQL Server 2005!
Go get 'em!

Friday, April 15, 2005

Hmmm very very weird behaviour :s

We have an Execute SQL task that calls a stored procedure containing xp_cmdshell to BCP out 2 tables and some other stuff. Somewhere after the export of the second table the step reports successful completion, although there are a couple of commands that still need to be executed after the BCP (in the same stored procedure).

Apparently - still investigating though - using NO_OUTPUT solves the problem
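For completeness, this is the shape of the call with the option (table and path invented):

```sql
EXEC master..xp_cmdshell 'bcp MyDb.dbo.tbl_export out C:\export.txt -c -T', NO_OUTPUT
```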
Ok :x
We've been looking for a problem in a DTS package for 3 days with 2 people already.
I've stumbled on the following KB which explains why we were probably looking in the wrong places :(

FIX: A DTS Package That Uses Global Variables Ignores Error Message Raised by RAISERROR

Solved in a hotfix and of course in SP4 which should be arriving around June (I guess)
Time for the next tip too :)

Tip 2

What is the fiber mode used for and when should I use it?

SQL Server 7.0 and 2000, by default, run in what is called "thread mode." What this means is that SQL Server uses what are called UMS (User Mode Schedulers) threads to run user processes. SQL Server will create one UMS thread per processor, with each one taking turns running the many user processes found on a busy SQL Server. For optimum efficiency, the UMS attempts to balance the number of user processes run by each thread, which in effect tries to evenly balance all of the user processes over all the CPUs in the server.

SQL Server also has an optional mode it can run in, called fiber mode. In this case, SQL Server uses one thread per processor (like thread mode), but the difference is that multiple fibers are run within each thread. Fibers are used to assume the identity of the thread they are executing and are non-preemptive to other SQL Server threads running on the server. Think of a fiber as a "lightweight thread," which, under certain circumstances, takes less overhead than standard UMS threads to manage. Fiber mode is turned on and off using the "lightweight pooling" SQL Server configuration option. The default value is "0", which means that fiber mode is turned off.

So what does all this mean? Like everything, there are pros and cons to running in one mode over another. Generally speaking, fiber mode is only beneficial when all of the following circumstances exist:

--Two or more CPUs are found on the server (the more CPUs, the larger the benefit).

--All of the CPUS are running near maximum (95-100%) most of the time.

--There is a lot of context switching occurring on the server (as reported by the Performance Monitor System object Context Switches/sec). Generally speaking, more than 5,000 context switches per second is considered high.

--The server is making little or no use of distributed queries or extended stored procedures.

If all of the above are true, then turning on the "lightweight pooling" option in SQL Server may yield a 5% or greater boost in performance.

But if the four circumstances are not all true, then turning on "lightweight pooling" could actually degrade performance. For example, if your server makes use of many distributed queries or extended stored procedures, then turning on "lightweight pooling" will definitely cause a problem because they cannot make use of fibers, which means that SQL Server will have to switch back and forth from fiber mode to thread mode as needed, which hurts SQL Server's performance.

Also note that some features no longer work when enabling this mode like SQLMail.

(c) http://www.sql-server-performance.com
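Switching the mode, for reference (the setting only takes effect after a restart as far as I know):

```sql
sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- 1 = fiber mode ('lightweight pooling'), 0 = default thread mode
sp_configure 'lightweight pooling', 1
RECONFIGURE
GO
```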
Aaah we're going to Spain in a month or 2 with the company. This is a yearly habit but this year 2 people from MCS will be joining us. There will be a training about SQL Server 2005 and one about Visual Studio 2005. Really looking forward to it of course :)

Although the "Feria" is quite nice too :p

Thursday, April 07, 2005

Hmm long time ago, been really busy @ work and @ home ;)

Read about some interesting new features in SQL Server 2005.
- CTE
- CROSS/OUTER APPLY
- DELETE/UPDATE OUTPUT INTO
- INSERT WITH MERGE
- XPath Queries in XML columns
- Triggers on DDL
- ...

The list is long and very very interesting :(

I can hardly wait to use it but unfortunately the release will be around the end of the year, and it will take some time before companies start using it too. At home I will of course enjoy it to its fullest :)

Wednesday, March 16, 2005

Debugging in a production Environment

Check out this link, the article is written by a colleague of ours and it's quite nice (kk biased information but I still think it is :p)

Monday, March 14, 2005

Over the years I've gathered a list of FAQ's about SQL Server.
I'll try to post some of them from time to time. Please note that some content might be from sources other than myself; I will try to credit them as accurately as possible!

Tip 1:
A good query to view index information (© Kimberly Tripp)

SELECT OBJECT_NAME(si.[id]) AS [TableName]
    , si.rowcnt AS [Row Count]
    , CASE
          WHEN si.indid = 0 THEN 'Heap'
          WHEN si.indid = 1 THEN 'CL'
          WHEN INDEXPROPERTY(si.[id], si.[name], 'IsAutoStatistics') = 1 THEN 'Stats-Auto'
          WHEN INDEXPROPERTY(si.[id], si.[name], 'IsHypothetical') = 1 THEN 'Stats-HIND'
          WHEN INDEXPROPERTY(si.[id], si.[name], 'IsStatistics') = 1 THEN 'Stats-User'
          WHEN si.indid BETWEEN 2 AND 250 THEN 'NC ' + RIGHT('00' + CONVERT(varchar, si.indid), 3)
          ELSE 'Text/Image'
      END AS [IndexType]
    , si.[name] AS [IndexName]
    , si.indid
    , CASE
          WHEN si.indid BETWEEN 1 AND 250 AND STATS_DATE(si.[id], si.indid) < DATEADD(m, -1, GETDATE())
              THEN '!! More than a month OLD !!'
          WHEN si.indid BETWEEN 1 AND 250 AND STATS_DATE(si.[id], si.indid) < DATEADD(wk, -1, GETDATE())
              THEN '! Within the past month !'
          WHEN si.indid BETWEEN 1 AND 250 THEN 'Stats recent'
          ELSE ''
      END AS [Warning]
    , STATS_DATE(si.[id], si.indid) AS [Last Stats Update]
FROM sysindexes AS si
WHERE OBJECTPROPERTY(si.[id], 'IsUserTable') = 1
ORDER BY [TableName], si.[indid]

64-Bit and SP1 for Windows 2003 Server... it's coming!
click

Wednesday, March 09, 2005

Nice post about concurrency... be sure to read the comments too!

Tuesday, March 08, 2005

If you're into Reporting Services check out http://sqldbatips.com/showarticle.asp?ID=62
This is a nifty little tool that scripts your reports for deployment on other servers; it also generates a handy batch file for the deployment.

Thursday, March 03, 2005

Uninstalling Reporting Services

Nice little problem here :x
Someone installed our DEV servers with French as the regional setting. Because my colleague and I don't like French (on a computer) we decided to change it to English. We had to uninstall RS to start with a fresh installation, but that wasn't what RS wanted. The uninstall attempt just gave a message box saying "Fatal Installation Error". A very descriptive error :( When uninstalling from setup.exe we would just get 1603 as an error code.

Reinstalling the French regional setting solved the problem: we uninstalled RS, removed French, and installed RS again.

Voulez-vous supprimer l'application?
For all you lazy people : http://www.uimapper.net/
Haven't tried it yet but I'm planning to use it sometime... lazy as I am ;-)

Monday, February 28, 2005

ON DELETE SET NULL

Check out this article about this new feature in SQL Server 2005
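In short, a foreign key can now be declared so that deleting the parent row nulls out the referencing column instead of blocking the delete. A minimal sketch, with made-up table names:

```sql
-- Hypothetical parent/child tables: deleting a customer
-- sets CustomerID to NULL in its orders instead of failing.
CREATE TABLE Customers
(
    CustomerID int PRIMARY KEY
);

CREATE TABLE Orders
(
    OrderID int PRIMARY KEY,
    CustomerID int NULL
        REFERENCES Customers (CustomerID) ON DELETE SET NULL
);
```

Note the referencing column has to be nullable for this to work, obviously.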
SQL Server 2005

Still waiting for Beta 3... latest rumors say it will be released in Q1. Although some say it will probably be June 5th on TechEd. Patience is a virtue I suppose :-)

These editions will be available:

  • SQL Server 2005 Enterprise Edition, a complete data and analysis platform for large mission-critical business applications
  • SQL Server 2005 Standard Edition, a complete data and analysis platform designed for medium-sized businesses
  • SQL Server 2005 Workgroup Edition, an affordable, easy-to-use and simple-to-manage database solution for small to medium-sized organizations
  • SQL Server 2005 Express Edition, a no-cost, easy-to-use version of SQL Server 2005 designed for building simple data-driven applications
Some nice features available even in the Standard edition are 64-bit support, Database mirroring and clustering.

SQL Server 2000 Workgroup Edition will be released too but that's not as exciting as 2005 now is it ;-)

Check out the movie here.

Friday, February 25, 2005

I've been experimenting with the UnsafeAuthenticatedConnectionSharing setting in order to speed up the WebService call time. I must say that the results are quite satisfying for the time being. Going from several seconds to sub-second response time. In our solution the service gets called about 60.000 times in a row so performance is a key factor in the process. Obviously there are some downsides to this property. Make sure you carefully study the security aspect before implementing this. In our case it is an application that uses just one user account and runs on a server without user intervention.
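For reference, this is roughly how the setting can be applied in a generated web service proxy by overriding GetWebRequest; the connection group name is just an illustrative value:

```csharp
// Partial class extending the generated web service proxy.
// UnsafeAuthenticatedConnectionSharing lets authenticated (e.g. NTLM)
// connections be reused instead of re-authenticating every call.
public partial class ReportingServiceProxy
{
    protected override System.Net.WebRequest GetWebRequest(System.Uri uri)
    {
        System.Net.HttpWebRequest request =
            (System.Net.HttpWebRequest)base.GetWebRequest(uri);
        request.UnsafeAuthenticatedConnectionSharing = true;
        // Only requests in the same group share connections;
        // "BatchReportUser" is a hypothetical name for our single account.
        request.ConnectionGroupName = "BatchReportUser";
        return request;
    }
}
```

Because the shared connection stays authenticated as whoever opened it, this is only safe when every call runs under the same account, as in our batch scenario.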
Code profiling in .NET, sweet performance :)

Check out this article about various tools. I also tried the JetBrains NetProf which imho wasn't too bad either. If you don't know the people from JetBrains, check out ReSharper, that tool seriously rocks!

Thursday, February 24, 2005

Finally got around to some C#'ing :-)
In order to generate the massive amount of reports I'm writing a multi-process, multi-threaded report generator.

Experimenting with the Enterprise Library (Patterns & Practices) which has some very nice features. The configuration tool is very handy and lets you visually configure the framework properties. Creating an Exception handler with logging is a matter of minutes.

To use it, just add a reference to the DLL. Policies define which exceptions to handle and how to handle them.

try
{
    int a = 10;
    int b = 0;
    int c = a / b;
}
catch (Exception ex)
{
    bool rethrow = ExceptionPolicy.HandleException(ex, "General Policy");
    if (rethrow)
    {
        // use "throw;" rather than "throw ex;" so the original
        // stack trace is preserved
        throw;
    }
}

The policy provides a couple of handlers:
  • Custom handler: you create your own handler class
  • Logging handler: logging to text file, eventlog, ...
  • Replace handler: replace an exception with a new one, for example to remove sensitive data from exception information
  • Wrap handler: to wrap an exception in another exception (InnerException), for example to pass exceptions from your data layer to your business layer which then handles the exception

I hope more posts will follow about the Enterprise Library!

Thursday, February 17, 2005

Recently installed Panda Antivirus 2005... boy oh boy the problems I have.

- explorer.exe peaks 99% cpu usage
- browsing My Computer takes 20 secs per click
- Outlook no longer starts
- ...

I started sending mails three weeks ago (with occasional reminders) and still haven't received an answer. I uninstalled it yesterday and I will never buy anything from Panda Software again :-( It's bad enough that it doesn't work, but I go completely crazy when people ignore my problems :-) By the time they answer, my computer could be infected with 1000 viruses.

Monday, February 14, 2005

We're trying to create a reporting solution with Reporting Services here.

Although I am impressed with the power and features of Reporting Services, there are still a lot of questions. Apparently there is little information on the net concerning the more advanced topics. If we find answers to the more 'difficult' questions, I'll make sure to post them here. Stay tuned!

Recently a very nice article was posted on MSDN about Snapshot Isolation, check it out http://msdn.microsoft.com/sql/default.aspx?pull=/library/en-us/dnsql90/html/sql2k5snapshotisol.asp

Oh yeah... happy valentine ;-)

Monday, January 31, 2005

DevDays 2005

Tomorrow (and Wednesday) I will be attending the DevDays 2005 (http://www.microsoft.com/belux/nl/devitprodays/). I'll be taking the SQL Server 2005 track of course.

I did my last exam 2 weeks ago and I'm now an MCDBA (for what it's worth).