

Posts Tagged ‘T-SQL’

T-SQL Tuesday #7: T-SQL Enhancements in SQL 2008.

08 Jun

 

[image: T-SQL Tuesday logo]

In this edition of T-SQL Tuesday, Jorge Segarra (Blog | Twitter) asks us what our favorite new feature of SQL 2008 or 2008 R2 is.  I’ve decided to focus on the T-SQL and query writing enhancements of 2008.  Before I do so though, let me preface this by noting that in no way do I believe these changes are the biggest improvements or best new things in SQL 2008, but things like Data compression are bound to be covered by several others.  Also, while writing this post I noticed that most of this has already been covered better than I could hope to by Itzik Ben-Gan in his white paper (Link).  Please refer to that for more information on the new features of 2008.

 

Your New Home for One-Stop Variable Declaration

In the past, you always had to declare variables on one line and then assign them on the next, like so:

DECLARE @MyInt int

SET @MyInt = 44

 

Now, you can do this in one statement. 

DECLARE @MyInt int = 44

 

That might seem small, but it’s significant when you’re dealing with large numbers of variables.

SQL++ … almost

While you still can’t do something like:

 

SET @MyInt++

You can do the slightly longer version of:

 

SET @MyInt += @MyInt

-- OR

SET @MyInt += 5

Instead of:

 

SET @MyInt = @MyInt + @MyInt

-- OR

SET @MyInt = @MyInt + 5

This applies to:

+= (plus equals)

-=  (minus equals)

*=  (multiplication equals)

/=  (division equals)

%=  (modulo equals)

 

This one isn’t too huge a deal in my opinion, but it’s a nice shortcut for those used to it from other programming languages.
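Just as a quick illustration (the variable name is made up for this example), here’s each of those operators in action:

DECLARE @Total int = 100

SET @Total += 10     -- 110
SET @Total -= 5      -- 105
SET @Total *= 2      -- 210
SET @Total /= 3      -- 70
SET @Total %= 11     -- 4

SELECT @Total        -- Returns 4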

Values of all rows, Union of None

In 2008, you can insert multiple rows of data with a single VALUES clause.  This one is really handy when I’m doing code examples in blog posts / on forums / in presentations etc.  It serves as an excellent replacement for the UNION ALL construct or the repeated INSERT/VALUES pairs you used to have to use when supplying sample data. 

 

Say you have a simple table:

CREATE TABLE #Cake(

SomeInt           int,

SomeChar    char(5)

)

 

You want to provide some sample data for that table.  The most common ways prior to now were either:

INSERT INTO #Cake(SomeInt,SomeChar)
VALUES(1,'AAA')

INSERT INTO #Cake(SomeInt,SomeChar)
VALUES(2,'BBB')

INSERT INTO #Cake(SomeInt,SomeChar)
VALUES(3,'CCC')

-- OR

INSERT INTO #Cake(SomeInt,SomeChar)
SELECT 1,'AAA' UNION ALL
SELECT 2,'BBB' UNION ALL
SELECT 3,'CCC'

 

Now, you can use the much cleaner:

INSERT INTO #Cake(SomeInt,SomeChar)
VALUES(1,'AAA'),
      (2,'BBB'),
      (3,'CCC')

-- Or, on one line:
INSERT INTO #Cake(SomeInt,SomeChar)
VALUES(1,'AAA'),(2,'BBB'),(3,'CCC')
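As a side note, the same row constructor can also be used as a derived table in a FROM clause (the table and column names here are just made up for the example):

SELECT Flavors.FlavorId, Flavors.FlavorName
FROM (VALUES(1,'Chocolate'),
            (2,'Vanilla'),
            (3,'Strawberry')) Flavors(FlavorId, FlavorName)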

Lack of Intellisense

No conversation on the topic of coding enhancements would be complete without mentioning IntelliSense in some way.  So… it’s there.  It works (kinda).  It could use a whole lot more fine tuning and configuration options than it has right now, but if you don’t have a copy of SQL Prompt, it’s better than nothing (sometimes).  If you *do* have a copy of SQL Prompt and want a couple of the cool things SQL IntelliSense has that SQL Prompt does not, you can use the hybrid approach that I’ve gone with.  This lets me get the () highlighting and error underlines from SQL IntelliSense without overriding the much more configurable (and in my opinion less annoying) suggestions from SQL Prompt. 

If you go to Tools > Options > Text Editor > Transact-SQL and turn IntelliSense on, but turn Auto List Members off (under General), you can have what is (in my opinion) the best of both worlds. 


That about wraps up what I wanted to point out this go-round; hopefully you found something new that you didn’t know about before.  Don’t forget to check out all the other T-SQL Tuesday posts (click the image at the top for a link to the others), which will no doubt cover many of the much bigger improvements in SQL Server 2008 and R2.

 
1 Comment

Posted in All

 

Create Indexes from the Missing Indexes DMV

24 May

Glenn Berry (Blog) writes a lot of queries to extract information from the system DMVs.  One of them in particular I found extremely helpful in fixing some of the issues in my system.  I took his query (the CTE at the top) and added some text manipulation to actually generate the CREATE statements for you, to save you some time.  I had much grander plans for this, but I’ve been meaning to post it for over a month now and simply haven’t had time to get back to it.  Rather than let it go by the wayside and never post it, I figured I’d post what I have now and possibly post an update in the future if I ever finish it. 

A couple of the known problems right now are: 

  • Index names could already be taken; there’s nothing here that checks that the generated names are unique against the other indexes in your database.
  • No compression options are taken into account.

That said, I still found this fairly useful and hopefully somebody else will as well.  Thanks again to Glenn for all his excellent work creating queries to pull information from the DMVs.

;WITH I AS (
-- Missing Indexes current database by Index Advantage
-- This DMV Query written by Glenn Berry
SELECT user_seeks * avg_total_user_cost *
      (avg_user_impact * 0.01) AS [index_advantage],
migs.last_user_seek,
mid.[statement] AS [Database.Schema.Table],
mid.equality_columns, mid.inequality_columns,
mid.included_columns, migs.unique_compiles, migs.user_seeks,
migs.avg_total_user_cost, migs.avg_user_impact
FROM sys.dm_db_missing_index_group_stats AS migs WITH (NOLOCK)
INNER JOIN sys.dm_db_missing_index_groups AS mig WITH (NOLOCK)
ON migs.group_handle = mig.index_group_handle
INNER JOIN sys.dm_db_missing_index_details AS mid WITH (NOLOCK)
ON mig.index_handle = mid.index_handle
WHERE mid.database_id = DB_ID()
      AND user_seeks * avg_total_user_cost *
      (avg_user_impact * 0.01) > 9000 -- Set this to whatever threshold you want
)

SELECT 'CREATE INDEX IX_'
            + SUBSTRING([Database.Schema.Table],
                        CHARINDEX('].[',[Database.Schema.Table],
                        CHARINDEX('].[',[Database.Schema.Table])+4)+3,
                        LEN([Database.Schema.Table]) -
                        (CHARINDEX('].[',[Database.Schema.Table],
                        CHARINDEX('].[',[Database.Schema.Table])+4)+3))
            + '_' + LEFT(REPLACE(REPLACE(REPLACE(REPLACE(
            ISNULL(equality_columns, inequality_columns),
            '[',''),']',''),' ',''),',',''),20)
            + ' ON '
            + [Database.Schema.Table]
            + '('
            + ISNULL(equality_columns,'')
            + CASE WHEN equality_columns IS NOT NULL AND
                        inequality_columns IS NOT NULL
                  THEN ','
                  ELSE ''
              END
            + ISNULL(inequality_columns,'')
            + ')'
            + CASE WHEN included_columns IS NOT NULL
                  THEN ' INCLUDE(' + included_columns + ')'
                  ELSE ''
              END CreateStatement,
            'IX_'
            + SUBSTRING([Database.Schema.Table],
                        CHARINDEX('].[',[Database.Schema.Table],
                        CHARINDEX('].[',[Database.Schema.Table])+4)+3,
                        LEN([Database.Schema.Table]) -
                        (CHARINDEX('].[',[Database.Schema.Table],
                        CHARINDEX('].[',[Database.Schema.Table])+4)+3))
            + '_' + LEFT(REPLACE(REPLACE(REPLACE(REPLACE(
            ISNULL(equality_columns, inequality_columns),
            '[',''),']',''),' ',''),',',''),20)
            IndexName
FROM I
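If you want to tackle the first known problem above, one option (just a rough sketch, not part of Glenn's query) is to land the output in a temp table and compare the generated names against sys.indexes before running anything.  The #Suggested table below is hypothetical; it assumes you saved the query's output, plus the [Database.Schema.Table] column, into a temp table with that name.

-- Hypothetical check: only return CREATE statements whose generated name doesn't already exist on that table.
SELECT s.CreateStatement
FROM #Suggested s
WHERE NOT EXISTS (SELECT 1
                  FROM sys.indexes i
                  WHERE i.name = s.IndexName
                    AND i.[object_id] = OBJECT_ID(s.[Database.Schema.Table]))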

 
1 Comment

Posted in All

 

T-SQL Tuesday #005: Monitoring Reports with SSRS

13 Apr

This post is a T-SQL Tuesday Entry, hosted this week by Aaron Nelson on the topic of “Reporting”.  (It got a little long.  Ordinarily I’d have broken this up into a series and fleshed out individual pieces a bit better, but this touches on most of the general points)

I like babysitter reports.  What is a “babysitter” report?  It’s a report that you schedule to run on a recurring basis that checks things for you.  I call them babysitter reports because they can monitor things without you having to worry about them.  Every environment has different things that need to be watched.  Maybe a certain value found its way into a certain table and you need to take action because of it.  Maybe a certain query is on a rampage again and you need to kill it.  There are all kinds of things that you know you should keep an eye on but don’t always remember to check.  Instead of putting that burden on your memory or calendar, these automated reports do the work for you.

Here I will show you how to create one simple babysitter report.  I intentionally chose one of the more complicated ones (CPU Threshold) to note how far it could be expanded upon, but more basic things would not require this level of customization.  Here are a few examples of things that you could create babysitter reports for:

  • Long Running Queries
  • Blocking SPs
  • Index fragmentation
  • Log Size
  • System Info
  • Specific entries into tables

The sky is the limit.  The same strategies can be used to get information to your users when rare events occur that require immediate action if your system doesn’t already provide a means to get this information to them in a timely manner.  There are certain reports in my environment that can run for *days* if the wrong parameters are sent to them… and while ideally these would be fixed in other ways, it’s good to identify the situations that occur in the interim and take action until that can be accomplished.

Here are a few sample queries for finding queries with abnormally high CPU usage.  There are two basic parts to these.  The first is the data driven subscription query.  You want this to be as streamlined as possible.  This is the piece that will run repeatedly to see if a problem exists, and because it could run hundreds of times before its criteria are met once, you want it to be as efficient as possible.

/*
  =============================================================================================
CREATE DATE:     04/12/2010
LAST MODIFIED:    04/12/2010
CREATED BY:        SETH PHELABAUM
PURPOSE:        Data Driven Subscription that monitors for queries using high CPU.
ISSUES:            Will Notify you Repeatedly.
Notes:            This can be expanded upon quite a bit.  For instance, you could also:
                  Set up a logging table / set of tables to control how often this notifies you (to stop you from getting multiple emails overnight)
                  Set up a list of excluded SPs
                  Set up a list of different actions depending on the time of day (you could also change the schedule in Reporting Services)
                  Much more...
=============================================================================================
*/ 

CREATE PROCEDURE DDS_HighCPU
AS

SELECT DISTINCT spid, 'youremailaddress@yourdomain.com' Notify
FROM sys.sysprocesses
WHERE [dbid] > 4                -- May need to filter out additional databases here for your setup
  AND cpu > 10000               -- Adjust to whatever you consider worth knowing about.
  AND cmd <> 'AWAITING COMMAND' -- Don't want to be notified about these.
  AND spid IN (SELECT spid
               FROM sys.dm_exec_connections DMV_EC
               WHERE DMV_EC.last_read > DATEADD(mm,-2,GETDATE())
                  OR DMV_EC.last_write > DATEADD(mm,-2,GETDATE())) -- Another filter to hopefully stop some excess emails


The second part is the actual report query.  This can be a bit more in-depth and contain all kinds of information that helps you take action based on the event that transpired.

/*
  =============================================================================================
CREATE DATE:     04/12/2010
LAST MODIFIED:    04/12/2010
CREATED BY:        SETH PHELABAUM
PURPOSE:        Pulls information about queries that use a large amount of CPU
ISSUES:            Will Notify you Repeatedly.
Notes:            This can be expanded upon quite a bit.  For instance, you could also include:
                  Trace events
                  Blocked processes
                  System stats (current CPU usage / IO etc.)
                  Much more...
============================================================================================= 

*/
CREATE PROCEDURE Rpt_HighCPU(
      @SPID       int
)
AS

DECLARE @sql_handle varbinary(64),
        @stmt_start int,
        @stmt_end   int,
        @FNGS       nvarchar(max),
        @DBIB       nvarchar(4000)

SELECT TOP 1 @sql_handle = sql_handle, @stmt_start = stmt_start, @stmt_end = stmt_end
FROM sys.sysprocesses (nolock)
WHERE spid = @SPID
ORDER BY sql_handle DESC --Or stmt_start DESC

SELECT @FNGS = CASE WHEN @stmt_start > 0
                    THEN SUBSTRING(text, (@stmt_start + 2)/2,
                          CASE @stmt_end
                            WHEN -1 THEN (DATALENGTH(text))
                            ELSE (@stmt_end - @stmt_start + 2)/2
                          END)
                    ELSE [Text]
               END
FROM ::fn_get_sql(@sql_handle)

CREATE TABLE #B(eventtype nvarchar(30), parameters int, eventinfo nvarchar(4000))

INSERT INTO #B(EventType, Parameters, EventInfo)
EXEC ('dbcc inputbuffer (' + CAST(@SPID AS varchar(10)) + ') with no_infomsgs') -- CAST so the int spid concatenates cleanly

SELECT @DBIB = EventInfo FROM #B

SELECT TOP 1
       @FNGS FNGS,
       @DBIB DBIB,
       cpu,
       physical_io,
       memusage,
       status,
       nt_username,
       last_batch
FROM sys.sysprocesses
WHERE spid = @SPID
ORDER BY sql_handle DESC


We also need something that will put a strain on the server to demonstrate the report in action, so I created this ridiculous little SP to run for a while.

CREATE PROCEDURE dbo.SillyLongRun
AS
exec dbo.SillyLongRun2
GO

/*
  =============================================================================================
CREATE DATE:     04/12/2010
LAST MODIFIED:    04/12/2010
CREATED BY:        SETH PHELABAUM
PURPOSE:        To run for a while.
ISSUES:            Totally Pointless
Notes:            Header here mainly to demonstrate the usage of stmt_start and stmt_end with fn_get_sql.
                  This thing sucks up resources, so don't run it on a production box.
=============================================================================================
exec dbo.SillyLongRun2 

*/
  CREATE PROCEDURE dbo.SillyLongRun2

AS 

; WITH
-- Tally table Gen                           Tally Rows:  X2            ,  X3
t1 AS (SELECT 1 N UNION ALL SELECT 1 N),  -- 4             ,  8
t2 AS (SELECT 1 N FROM t1 x, t1 y),       -- 16            ,  64
t3 AS (SELECT 1 N FROM t2 x, t2 y),       -- 256           ,  4096
t4 AS (SELECT 1 N FROM t3 x, t3 y),       -- 65536         ,  16,777,216
t5 AS (SELECT 1 N FROM t4 x, t4 y),       -- 4,294,967,296 ,  A lot
Tally AS (SELECT CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as bigint) N
FROM t5 x, t5 y) -- Change the t5's to one of the other levels above for more/less rows

SELECT N, CAST(N as varchar(30)), DATEADD(ms,n,0)
FROM Tally

(You can call the above with exec dbo.SillyLongRun)

As mentioned in the headers, you would ideally keep a log of when you were notified about things.  Different alerts could be scheduled to have a different frequency.  Perhaps you only want to be notified about certain things once a week, but other things you want to be notified about once an hour until they are taken care of.  This is where a logging table comes in.  I won’t go into that here, but wanted to mention it.
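Purely as a rough sketch of the idea (none of these table or column names exist in the report above; they're made up for illustration), such a logging table and the corresponding check in the subscription query might look something like this:

-- Hypothetical notification log
CREATE TABLE NotificationLog(
      AlertType     varchar(50),
      spid          int,
      NotifiedAt    datetime DEFAULT GETDATE()
)

-- Inside the data driven subscription, skip spids you've already been emailed about in the last hour:
SELECT DISTINCT sp.spid, 'youremailaddress@yourdomain.com' Notify
FROM sys.sysprocesses sp
WHERE sp.cpu > 10000
  AND NOT EXISTS (SELECT 1
                  FROM NotificationLog nl
                  WHERE nl.AlertType = 'HighCPU'
                    AND nl.spid = sp.spid
                    AND nl.NotifiedAt > DATEADD(hh,-1,GETDATE()))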

Now that we have the queries, we need to set up the report.  I’m going to assume that you already have Reporting Services set up.  Here are a couple of screenshots of a very basic report that I created to pull in the data.

ReportSetup

ReportDesignMode

Once you deploy this report, there are a couple more things you need to do before you can create a data driven subscription for it.  The first is to set up a shared schedule.  Log into your report server (http://localhost/reports) and go to Site Settings at the top.  Then click Schedules on the left and New Schedule.  For this one I’m just going to create a basic 15-minute recurring schedule.

SharedSchedule

Next we need to modify the credentials of the shared data source used for the report.  My data source name for the report is SS2K8CL100.  To modify it, I go back to Home -> Data Sources -> SS2K8CL100.  The screenshot below shows me modifying it to use a Windows account.

SecuritySettings

Now we’re ready to create the data driven subscription.  Rather than explain each step in text, I’ve taken screenshots of the whole process.

DDS DDS2 DDS3 DDS4 DDS5 DDS6 DDS7

Click Finish and you have your report.


In closing, I’ll note that I had a lot of problems getting Reporting Services to function correctly on my Windows 7 installation, so this isn’t as polished as I would have liked.  I didn’t get the email delivery working, and I forgot to include the SPID anywhere on the report (a pretty useful piece of information to have).

 
2 Comments

Posted in All

 

A T-SQL Holiday Message

03 Apr

Totally pointless, but fun.  Run it to decode the message.

DECLARE @Message VARCHAR(20)
SET @Message = '????????????'

DECLARE @Decode TABLE(
DSeq tinyint,
DKey SMALLINT
)

INSERT INTO @Decode(DSeq, DKey)
VALUES(1,9),(2,2),(3,17),(4,17),(5,26),(6,6),(7,2),(8,20),(9,21),(10,6),(11,19)

; WITH
Decoder AS (
SELECT D.DSeq N,
CHAR( D.DKey + ASCII(SUBSTRING(@Message,D.DSeq,1)) ) DV
FROM @Decode D),
Conc (S) AS (
SELECT DV + ''
FROM Decoder D
ORDER BY D.N
FOR XML PATH(''))

SELECT STUFF(S,6,0,' ') [Surprise]
FROM Conc
 
2 Comments

Posted in All

 

Tally Table CTE

19 Mar

Now that I have several posts on what you can do with a Tally table, I figured I’d share my favorite way to create one inline.  I still prefer a physical Tally table (usually in a Utility database that can be accessed from anywhere and doesn’t need to be created in each individual database) for permanent code, but for times when you need one on the fly, this is my preferred method.  I can’t really take the credit for this query; the base construct is based on something I’ve seen attributed to Itzik Ben-Gan.  I’ve modified it a bit and changed up the formatting to be the way I like it.  For anything over a few thousand rows I’d probably use a physical Tally table, but on small numbers you shouldn’t see much of a performance hit with this script.

-- Tally Table CTE script (SQL 2005+ only)
-- You can use this to create many different numbers of rows... for example:
-- You could use a 3-way cross join (t3 x, t3 y, t3 z) instead of just 2-way to generate a different number of rows.
-- The # of rows this would generate for each is noted in the X3 comment column below.
-- For most common usage, I find t3 or t4 to be enough, so that is what is coded here.
-- If you use t3 in 'Tally', you can delete t4 and t5.

; WITH
-- Tally table Gen                           Tally Rows:  X2            ,  X3
t1 AS (SELECT 1 N UNION ALL SELECT 1 N),  -- 4             ,  8
t2 AS (SELECT 1 N FROM t1 x, t1 y),       -- 16            ,  64
t3 AS (SELECT 1 N FROM t2 x, t2 y),       -- 256           ,  4096
t4 AS (SELECT 1 N FROM t3 x, t3 y),       -- 65536         ,  16,777,216
t5 AS (SELECT 1 N FROM t4 x, t4 y),       -- 4,294,967,296 ,  A lot
Tally AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) N
FROM t3 x, t3 y) -- Change the t3's to one of the other levels above for more/less rows
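Since a CTE has to be followed immediately by a statement that uses it, here's a quick illustrative example (the date and row count are arbitrary) with a SELECT tacked onto the trimmed-down t3 version to generate the first ten days of 2010:

; WITH
t1 AS (SELECT 1 N UNION ALL SELECT 1 N),
t2 AS (SELECT 1 N FROM t1 x, t1 y),
t3 AS (SELECT 1 N FROM t2 x, t2 y),
Tally AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) N
          FROM t3 x, t3 y)
SELECT TOP 10 DATEADD(dd, CAST(N AS int) - 1, '20100101') TheDate
FROM Tally
ORDER BY N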
 
No Comments

Posted in All

 

Tally Table – Delimited list to Table

27 Feb

Dealing with delimited lists (usually separated by a comma) in SQL is a problem easily handled by a simple function and a Tally table.  (Tally tables are also often referred to as Numbers tables or spt_values tables.  If you still don’t know what that is, please see this excellent article on Tally tables written by my friend and SSC heavyweight Jeff Moden.)  This particular implementation is somewhat specific in nature, but it gives you an alternative to dynamic SQL when you want to pass in a list as a parameter and use it in an IN inside a stored procedure.  The following function will take your delimiter and string and parse the string into a table so you can do your IN.  (I’m leaving my standard header on the function in this case because there are some good notes in there.)

/*
=============================================================================================
CREATE DATE:     02/27/2010
LAST MODIFIED:    02/27/2010
CREATED BY:        SETH PHELABAUM
PURPOSE:        Splits a string based on a passed in delimiter and returns a table.
ISSUES:            Strings with embedded single quotes will break this function; handle that on the end that calls it.
Notes:            To make it a simpler function, I removed the piece that trimmed spaces around commas.  Do
this before or after calling it.
Revision History:
Date     By        Change Made
-------- ---      -------------------------------------
=============================================================================================
GRANT SELECT ON TVF_TallySplit TO [Somebody]
SELECT * FROM TVF_TallySplit(',','Orange,Apple,Banana,Pear,Watermelon,Grape')
SELECT * FROM TVF_TallySplit('*','Orange*Apple*Banana*Pear*Watermelon*Grape')
DROP FUNCTION TVF_TallySplit
*/


CREATE FUNCTION TVF_TallySplit(
@Delim            CHAR(1),            -- List Delimiter
@String            VARCHAR(8000))
RETURNS TABLE
AS

RETURN(
SELECT SUBSTRING(@Delim + @String + @Delim,N+1,CHARINDEX(@Delim,@Delim + @String + @Delim,N+1)-N-1) ListValue
FROM Tally
WHERE N < LEN(@Delim + @String + @Delim)
AND SUBSTRING(@Delim + @String + @Delim,N,1) = @Delim )

What to do with this
Let’s say you have a table containing names of your favorite fruits.  (In case you were wondering… No, these aren’t my favorite fruits; they were just ones that immediately came to mind when writing this.  I don’t even like half of these.)

CREATE TABLE Fruits(
Name        VARCHAR(25))

INSERT INTO Fruits(Name)
SELECT 'Apple' UNION ALL SELECT 'Banana' UNION ALL SELECT 'Grapefruit' UNION ALL
SELECT 'Kiwi' UNION ALL SELECT 'Tomatoe' UNION ALL SELECT 'Grape' UNION ALL
SELECT 'Orange' UNION ALL SELECT 'DragonFruit' UNION ALL SELECT 'Strawberry'

You then ask someone else what their favorite fruits are and want to see what fruits you have in common.  You might think you could just write a query for that like this:

DECLARE @YFFruits VARCHAR(200)

SET @YFFruits = 'Orange,Apple,Banana,Pear,Watermelon,Grape'
-- OR SET @YFFruits = '''Orange'',''Apple'',''Banana'',''Pear'',''Watermelon'',''Grape'''

SELECT * FROM Fruits WHERE Name LIKE (@YFFruits)
-- OR SELECT * FROM Fruits where Name IN (@YFFruits)

However, this won’t work, because in all these cases SQL is looking for a single fruit named 'Orange,Apple,Banana,Pear,Watermelon,Grape', not any fruit in what is really a list.

A common solution for this is to use Dynamic SQL which would make your query this:

DECLARE @YFFruits VARCHAR(200)
SET @YFFruits = '''Orange'',''Apple'',''Banana'',''Pear'',''Watermelon'',''Grape'''
EXEC('SELECT * FROM Fruits WHERE Name IN (' + @YFFruits + ')')

This works and will properly match up your fruits.

The above function allows you to accomplish your goal without Dynamic SQL with a query that looks like this:

DECLARE @YFFruits VARCHAR(200)
SET @YFFruits = 'Orange,Apple,Banana,Pear,Watermelon,Grape'
SELECT * FROM Fruits WHERE Name IN (SELECT * FROM Util.dbo.TVF_TallySplit(',',@YFFruits))

You can also simply join to the table instead of using the IN keyword which gives you more flexibility in your query writing.

DECLARE @YFFruits VARCHAR(200)
SET @YFFruits = 'Orange,Apple,Banana,Pear,Watermelon,Grape'
SELECT Fruits.*
FROM Fruits
INNER JOIN Util.dbo.TVF_TallySplit(',',@YFFruits) T ON Fruits.Name = T.ListValue

Note that because I wanted to keep the function somewhat simple, it does not handle extra spaces around the commas.  Single quotes within the string will also break it, which limits its usage somewhat.  If this is a concern for your implementation, you either need to replace the single quotes on both sides or use a different method.  Despite the fact that the example above uses a list of strings, in real-life situations I use this mainly for lists of uniqueidentifiers or numbers, where single quotes and spaces are never an issue.

 
1 Comment

Posted in All

 

Data Type Precedence and Implicit Conversions

21 Feb

In my last post, I noted that one of the biggest differences between ISNULL and COALESCE was that ISNULL attempts to convert the second parameter to the data type of the first parameter, whereas COALESCE converts according to the data type precedence table.  A reader requested that I go into more detail on what that means.  At first I wasn’t sure what I was going to explain; there didn’t seem to be a lot to talk about once I linked the BOL article on Data Type Precedence (which I meant to do in my initial post but apparently never did).  After thinking about it for a while, I realized that one thing the BOL page doesn’t really point out is what these implicit conversions can do to performance if you aren’t paying attention.  This post got a bit long.

This isn’t a topic that I’m really familiar with, so I had to do some research and testing of my own to write this.  I’ve had to fix the varchar/nvarchar one several times, but others I can’t truly explain.  Why does a char -> varchar comparison not trigger an implicit conversion?  Honestly, I’m not sure.  My guess would be that the optimizer is simply smart enough not to do it, but as I said, that’s just a guess.  While attempting to find the answer online, I stumbled across a brilliant script written by Jonathan Kehayias that focuses on finding implicit conversions in the plan cache.

The examples below focus on non-numeric conversions.  I did a good amount of testing on different numeric conversions, and although I’ve read that SQL 2000 had specific issues, I was not able to easily duplicate this with the numeric types in any compatibility level with my 2K8 installation (so I left those examples out).  If anyone has any good examples of this behavior with numeric data types, I’d be happy to add them.

Test Setup: (Note that because I am dropping/creating a real table and user defined types, you should be careful which database you execute this against.  I would suggest creating a new one and executing it there)

--- Drop and Re-create User Defined Type
IF EXISTS (SELECT * FROM sys.types WHERE name = 'UDVC') DROP TYPE UDVC
IF EXISTS (SELECT * FROM sys.types WHERE name = 'UDNVC') DROP TYPE UDNVC
CREATE TYPE UDVC FROM VARCHAR(60)
CREATE TYPE UDNVC FROM nvarchar(60)

--- Drop and Re-Create Test Table
IF EXISTS(SELECT * FROM sys.objects WHERE name = 'DTP' AND TYPE = 'U') DROP TABLE DTP
SELECT    TOP 100000 -- If you use too few rows, the performance differences aren't as apparent.
CAST(NEWID() AS nvarchar(60)) NVCCol,
CAST(NEWID() AS VARCHAR(60)) VCCol,
CAST(NEWID() AS CHAR(60)) CCol,
CAST(NEWID() AS sql_variant) SQLVCVarCol
INTO DTP
FROM Util..Tally --I keep a Tally table in a Utility Database named Util.
--Either change to the location of your tally table or use the other FROM statement below.
--FROM master..spt_values A CROSS JOIN master..spt_values B CROSS JOIN master..spt_values C

CREATE INDEX IX_NVCCol ON DTP(NVCCol)
CREATE INDEX IX_VCCol ON DTP(VCCol)
CREATE INDEX IX_CCol ON DTP(CCol)
CREATE INDEX IX_SQLVCVarCol ON DTP(SQLVCVarCol)
GO

Tests:

-- Test 1: Compare varchar and nvarchar against varchar column.
-- These will show a massive difference because the second query must convert the varchar column VCCol to nvarchar to compare the values.
-- In the execution plan, you will see that the first uses an index seek and the second uses an index scan.
SELECT VCCol FROM DTP WHERE VCCol = 'A'
SELECT VCCol FROM DTP WHERE VCCol =  N'A'

Test1

-- Test 2: Compare varchar and nvarchar against nvarchar column.
-- These show only a very small difference because nvarchar is higher in the precedence list, so the first query only has to convert 'A'
-- to an nvarchar rather than converting the entire column.
SELECT NVCCol FROM DTP WHERE NVCCol = 'A'
SELECT NVCCol FROM DTP WHERE NVCCol =  N'A'

Test2

-- Test 3:  Compare varchar and nvarchar against SQLVariant column.
-- Although this may seem very similar to Test 1, you won't see much of a difference here.  This is because sql_variant
-- is near the top of the data type precedence table (higher than varchar/nvarchar), so only the single value has to be converted.
SELECT SQLVCVarCol FROM DTP WHERE SQLVCVarCol = 'A'
SELECT SQLVCVarCol FROM DTP WHERE SQLVCVarCol = N'A'

Test3

-- Test 4:  Compare varchar and sql_variant against varchar column.
-- Here you get the massive difference you would expect because you must convert the entire column to a sql_variant.
-- This one actually has a different plan altogether and not just an index scan.
DECLARE @a sql_variant
SET @a = 'A'
SELECT VCCol FROM DTP WHERE VCCol = 'A'
SELECT VCCol FROM DTP WHERE VCCol = @a

SQLVariantPar

-- Test 5: Compare User Defined Types (with bases of varchar and nvarchar) against varchar column.
-- This behaves exactly as Test 1 did.  Conversions seem to be handled like they would be if the data types were the base types.
DECLARE @a UDVC
DECLARE @b UDNVC
SET @a = 'A'
SET @b = 'B'
SELECT VCCol FROM DTP WHERE VCCol = @a
SELECT VCCol FROM DTP WHERE VCCol = @b

Test5

--Test 6:  Compare char and varchar against char column
-- There is not any performance loss here.  It seems like there should be, but at least in my tests there is not.
DECLARE @a VARCHAR(60), @b CHAR(60)
SET @a = 'A'
SET @b = 'A'
SELECT CCol FROM DTP WHERE CCol = @b
SELECT CCol FROM DTP WHERE CCol = @a

Test6

The varchar/nvarchar conversions can be especially painful and tricky when the code is being passed in from elsewhere.  One of the places I’ve seen issues with this is LINQ to SQL.  It’s entirely possible that it was just our setup that was mismatched (I wasn’t involved in that end of it), but I figured I’d throw it out there anyway.

 
No Comments

Posted in All

 

ISNULL() VS. COALESCE()

11 Feb

There are a lot of arguments about which of these to use.  In my opinion, they both have their place.  Here are a few facts about the two functions.

  • ISNULL can accept only two arguments, whereas COALESCE can accept any number of input parameters.
  • ISNULL is not ANSI standard and is proprietary to T-SQL.
  • With ISNULL, the data type of the second argument is converted to match the data type of the first argument, whereas COALESCE converts according to data type precedence. 
  • ISNULL is generally accepted to be faster than COALESCE.
  • COALESCE is a much cooler word and will usually earn you either an impressed glance or a blank confused stare when slipped into casual conversation. 

Here are a few examples that demonstrate some of the functional differences between the two conversion methods.

declare @a varchar(5)
select ISNULL(@a, 'ISNULL Test 1')        -- Result: ISNUL
SELECT COALESCE(@a, 'COALESCE Test 1')    -- Result: COALESCE Test 1

declare @b tinyint
SELECT ISNULL(@b, 99999)                -- Result: **Error**
SELECT COALESCE(@b, 99999)                -- Result: 99999

declare @c char(5)
SELECT ISNULL(@c, 'a') + 'B'            -- Result: a    B
SELECT COALESCE(@c, 'a') + 'B'            -- Result: aB
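As a quick illustration of the first bullet (the variable names here are made up for the example), COALESCE walks any number of arguments and returns the first non-NULL one, while ISNULL has to be nested to do the same thing:

DECLARE @First varchar(10), @Second varchar(10), @Third varchar(10)
SET @Third = 'Third'

SELECT COALESCE(@First, @Second, @Third, 'None')                   -- Result: Third
SELECT ISNULL(@First, ISNULL(@Second, ISNULL(@Third, 'None')))     -- Result: Third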
 
3 Comments

Posted in All

 

IN and NOT IN

28 Jan

One of the most common mistakes made in T-SQL is thinking that these behave identically.  I’ve personally opened up a forum topic on it because I didn’t know what the difference was.  This post will join a small army of other places on the net devoted to correcting this misunderstanding.

They aren’t completely dissimilar; they behave exactly as you would expect them to… with the exception of NULLs.  Because nothing EQUALS NULL (dependent upon settings, see below), the difference in the internal logic matters.  Gail Shaw initially explained this to me when I asked the question on the forums and I wanted to use her explanation here, but I can’t seem to find it, so here’s my own version of an explanation:

When you use IN, you’re really saying: WHERE myvalue = 'A' OR myvalue = 'B' OR myvalue = NULL.
Your NULLs won’t cause the entire statement to fail because it’s only an OR.

When you use NOT IN, you’re really saying: WHERE myvalue <> 'A' AND myvalue <> 'B' AND myvalue <> NULL.
This is where the problem arises.  Since a NULL in SQL is an unknown value, you can’t test = or <> against it, so the comparison returns no results.  Without the NULL, you’d be fine.

Here’s a simple example to demonstrate.

DECLARE @T TABLE(
Val varchar(5)) 

DECLARE @T2 TABLE(
Val varchar(5)) 

INSERT INTO @T(Val)
SELECT 'A' UNION ALL SELECT 'B' UNION ALL SELECT 'C' UNION ALL
SELECT 'D' UNION ALL SELECT 'E' UNION ALL SELECT 'F' 

INSERT INTO @T2(Val)
SELECT 'A' UNION ALL SELECT 'B' UNION ALL SELECT 'C' UNION ALL
SELECT NULL 

SET ANSI_NULLS ON
SELECT * FROM @T WHERE Val IN (SELECT * FROM @T2)
SELECT * FROM @T WHERE Val NOT IN (SELECT * FROM @T2)
SELECT * FROM @T WHERE Val NOT IN (SELECT * FROM @T2 WHERE Val IS NOT NULL) 

This issue is further complicated by the ANSI_NULLS setting.  While I believe most people have this turned ON, the fact that it is an option introduces another variable into the mix.  NOT IN will not fail in the same way if you have ANSI_NULLS set to OFF.  (Try the above example again after changing ON to OFF)
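If the NULLs are outside your control, a commonly suggested alternative (not covered above; just a quick sketch reusing the same @T and @T2 tables from the example) is to rewrite the NOT IN as a NOT EXISTS, which isn’t tripped up by NULLs because it only checks whether a matching row is found:

-- Returns D, E and F even though @T2 contains a NULL (run in the same batch as the example above).
SELECT T.*
FROM @T T
WHERE NOT EXISTS (SELECT 1
                  FROM @T2 T2
                  WHERE T2.Val = T.Val)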

 
No Comments

Posted in All