Prepared T-SQL Statements


I recently started reading Grant Fritchey's book SQL Server Execution Plans. I didn't get too many pages deep before the topic of query parameterization and prepared statements came up. As Grant notes, prepared statements "are built from the application code". Prepared T-SQL statements in SQL Server have been somewhat mysterious to me for many years. It's something I felt I should have a better understanding of. I thought to myself, "I'm a developer. I should be able to figure this out." So I decided to go down the rabbit hole and see if I could cobble together an example of a prepared statement using C#.

I quickly found my way to Microsoft documentation for the SqlCommand.Prepare method. To my surprise, that method dates all the way back to version 1.1 of the .NET Framework, which was released in 2003. I was a new C# developer at that time. I didn't know about the method then. And as you're probably figuring out, I just learned about it now.

Before we dive into some code, I want to make a plea to the developers that may be reading. If you arrived here in search of a "How-To" article, please stop and reconsider. In my opinion, prepared statements via sp_prepexec and sp_execute are a poor choice for query parameterization. There are better options, including stored procedures, parameterizing in T-SQL with sp_executesql, and parameterizing ad-hoc queries with the SqlCommand object. I'm writing from the perspective of a DBA, sharing what I've learned with other DBAs.
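For comparison, here's a minimal sketch of the sp_executesql option, using the dbo.Stooges table from the demo below:

EXEC sp_executesql
    N'INSERT INTO dbo.Stooges (StoogeName, StoogeNumber) VALUES (@name, @number)',
    N'@name varchar(128), @number tinyint',
    @name = 'Larry', @number = 1;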

OK, DBAs. Are you ready for some C# code? I'll keep it simple and painless. Here's what the code is going to do: connect to SQL Server and insert three rows into dbo.Stooges. That's it. I'll use an ad-hoc query that is parameterized--there should be no risk of SQL injection. The query will be re-used for all three inserts. And SQL Server will use the same execution plan for each insert. Here is the code:


using (SqlConnection conn = new SqlConnection(GetConnectionString()))
{
    conn.Open();

    using (SqlCommand cmd = new SqlCommand())
    {
        cmd.Connection = conn;

        // Create and prepare a SQL statement.
        cmd.CommandText =
            "INSERT INTO dbo.Stooges (StoogeName, StoogeNumber) " +
            "VALUES (@name, @number)";
        SqlParameter parName = new SqlParameter("@name", SqlDbType.Text, 128);
        SqlParameter parNumber = new SqlParameter("@number", SqlDbType.TinyInt, 0); 
        parName.Value = "Larry";
        parNumber.Value = 1;
        cmd.Parameters.Add(parName);
        cmd.Parameters.Add(parNumber);

        // Call Prepare after setting the CommandText and Parameters.
        cmd.Prepare();
        cmd.ExecuteNonQuery();

        // Change parameter values and call ExecuteNonQuery.
        cmd.Parameters[0].Value = "Curly";
        cmd.Parameters[1].Value = 2;
        cmd.ExecuteNonQuery();

        // Change parameter values and call ExecuteNonQuery again.
        cmd.Parameters[0].Value = "Moe";
        cmd.Parameters[1].Value = 3;
        cmd.ExecuteNonQuery();
    }
}

At line 1, a connection to SQL Server is created. The connection string supplies the connectivity info. In this example, GetConnectionString() is a user-defined method that builds and returns our connection string; I've omitted the code for that method for brevity. At line 3, the connection is opened.

At line 5, a SqlCommand object is created. You can think of this as a "query" for SQL Server. The query string for the INSERT statement is created starting at line 10. You'll notice the @name and @number placeholders for the parameter values. Next, on lines 13 and 14, we create SqlParameter objects for @name and @number. Each SqlParameter has a parameter name, a SqlDbType (aka a data type), and a size. The SqlParameter variables are named parName and parNumber.

At lines 15 and 16, parameter values are set in anticipation of our first INSERT for Larry. Then at lines 17 and 18, parName and parNumber are each added to cmd.Parameters, which is a collection of zero or more SqlParameters.

Line 21 is where we "prepare" our query. Specifically, it creates a prepared, reusable version of the command (our INSERT statement) on an instance of SQL Server. Line 22 INSERTs our first stooge by invoking cmd.ExecuteNonQuery(). Most of the hard work is done now. Adding more stooges gets easier from here on out.

Our SqlCommand, cmd, still has two parameters in its collection, but they contain the values for Larry. On lines 25 and 26, we assign new values for Curly, who is INSERTed on line 27 where we once again invoke cmd.ExecuteNonQuery(). The process is repeated for Moe on lines 30-32.

And that's it from the application perspective. There's not really that much application code involved. If you were monitoring this activity via an Extended Events session (or a Profiler trace) it would look similar to this:

[Screenshot: Extended Events output of the prepared statement activity]

I'd seen stored procedures sp_prepexec and sp_execute during monitoring here and there throughout the years. But I never had a good grasp on how those stored procs worked or what they were doing. Now I have a much better idea.
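For reference, the captured calls look roughly like this (a sketch; the prepared-statement handle, shown here as 1, will vary):

DECLARE @p1 int;
EXEC sp_prepexec @p1 OUTPUT,
    N'@name text, @number tinyint',
    N'INSERT INTO dbo.Stooges (StoogeName, StoogeNumber) VALUES (@name, @number)',
    'Larry', 1;    -- prepares the statement and performs the first INSERT

EXEC sp_execute 1, 'Curly', 2;    -- re-executes the prepared statement by handle
EXEC sp_execute 1, 'Moe', 3;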

What about the execution plan(s)? How many did the prepared statement actually use? A quick look at sys.dm_exec_query_stats tells us that, as advertised, only one execution plan was used.

[Screenshot: sys.dm_exec_query_stats output showing a single execution plan]
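The check itself can be as simple as this sketch:

SELECT qs.execution_count, st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE N'%dbo.Stooges%'
    AND st.text NOT LIKE N'%dm_exec_query_stats%';    -- exclude this query itself

Here, execution_count shows three executions against the single cached plan.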


Introducing SQL-to-Excel


In 2018, I've found myself frequently running a series of static SQL Server queries and copying/pasting the output of each query into a separate sheet in a Microsoft Excel workbook. They say if you perform a manual task X number of times or more, you should automate it. Great advice, right? There are a number of ways to export SQL Server query results to Excel, and an internet search readily turns up many options. I didn't find any that suited my needs, so I wrote my own utility, uncleverly named "SQL to Excel".


Binary to Decimal via STRING_SPLIT

Colin Stasiuk posed an interesting question on Twitter recently:

I started pondering it for a bit and began to wonder if I could use the STRING_SPLIT function, new for SQL Server 2016, to convert a binary string to decimal. The thought process was to split the string into rows of CHAR(1) values, along with an in-string character position. Then I could take the "1" values, calculate the corresponding power of 2 for each, and sum them up for a decimal value.
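Here's a sketch of that thought process. Note the assumptions: the ordinal argument of STRING_SPLIT requires SQL Server 2022 or later; on SQL Server 2016 (the version in question), the function doesn't guarantee output order, so treat this as illustrative only.

DECLARE @binary varchar(64) = '101101';

SELECT SUM(CAST(s.value AS int) * POWER(2, LEN(@binary) - s.ordinal)) AS DecimalValue
FROM STRING_SPLIT(
    REPLACE(REPLACE(@binary, '0', '0,'), '1', '1,'),    -- comma after each bit
    ',', 1) AS s
WHERE s.value <> '';    -- ignore the trailing empty element

For '101101', the query returns 45 (32 + 8 + 4 + 1).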


WAITFOR DELAY - Too Much TIME On My Hands?


There are certain design patterns in T-SQL that give me pause. They may not be "code smells" per se, but when I encounter them I find myself thinking "there's got to be a more sensible way to accomplish this". WAITFOR DELAY is one example. I've used it a few times here and there, mostly in one-off scripts. If I wrote some code that used it regularly in production, I'd be paranoid about putting my SPID to sleep forever. Maybe a little bit of paranoia is a good thing. But I digress.
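For anyone who hasn't used it, WAITFOR DELAY simply pauses the current session for the specified duration:

WAITFOR DELAY '00:00:10';    -- sleep for ten seconds
SELECT SYSDATETIME() AS ResumedAt;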


Dynamic SQL, NULL, and EXEC


I've been a fan of dynamic T-SQL for a long time. I ran into something new, though, that threw me for a loop. It started with a stored procedure that builds a dynamic query string, based on a dozen or so input parameters that are all optional. If an input parameter is NULL, it gets excluded from the query logic. Here's a typical example with three parameters:
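The post's example isn't reproduced here, but a hedged sketch of the pattern looks like this (dbo.Widgets and its columns are hypothetical):

CREATE PROCEDURE dbo.FindWidgets
    @Name     varchar(128) = NULL,
    @Color    varchar(32)  = NULL,
    @MinPrice money        = NULL
AS
BEGIN
    -- Start with an always-true predicate, then append a clause
    -- for each parameter that was actually supplied.
    DECLARE @sql nvarchar(max) = N'SELECT * FROM dbo.Widgets WHERE 1 = 1';

    IF @Name IS NOT NULL
        SET @sql += N' AND WidgetName = @Name';
    IF @Color IS NOT NULL
        SET @sql += N' AND Color = @Color';
    IF @MinPrice IS NOT NULL
        SET @sql += N' AND Price >= @MinPrice';

    EXEC sp_executesql @sql,
        N'@Name varchar(128), @Color varchar(32), @MinPrice money',
        @Name, @Color, @MinPrice;
END;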


T-SQL Tuesday #106 - Triggers


For T-SQL Tuesday #106, Steve Jones (b|t) asks us to write about an experience with triggers, either good or bad. I'd bet that most of you are thinking about INSERT, UPDATE, and DELETE triggers. But for my DBA sensibilities, the best trigger is a DDL trigger.
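As a taste, here's a minimal DDL trigger sketch that captures CREATE TABLE events in the current database (dbo.DDLEventLog is a hypothetical logging table with EventTime and EventXml columns):

CREATE TRIGGER trg_Log_CreateTable
ON DATABASE
FOR CREATE_TABLE
AS
BEGIN
    -- EVENTDATA() returns XML describing the DDL event.
    INSERT INTO dbo.DDLEventLog (EventTime, EventXml)
    VALUES (SYSDATETIME(), EVENTDATA());
END;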


Preparing for Exam 70-767: Implementing a Data Warehouse


After a nearly two-year hiatus, I steered my learning path back towards certifications. I'm happy to tell you I recently passed Exam 70-767: Implementing a Data Warehouse using SQL. If you don't know, there's a non-disclosure agreement for Microsoft exams, so I can't divulge particulars about the exam. But as Cathrine Wilhelmsen (b | t) has done, I wanted to share my experience studying and preparing for the exam.


What's Your Learning Style?


I see the "What's your learning style?" question posed on social media from time to time, mostly from people in the SQL Server community. I rarely think about it, but I've been giving it some thought lately. In my completely unscientific estimation, when people in technical fields discuss their preferred style, the runaway top responses are learning by doing and learning by studying.


tempdb: Are We There Yet?


As a global resource, [tempdb] can be challenging to manage. It's a broad topic, so for this post, I'll focus on just these two scenarios that leave [tempdb] out of space:

  • Maximum file size is limited and there is no more free space to allocate new pages.
  • Files are configured for unlimited growth and have consumed all available disk space.

Queries to obtain information for either scenario are reasonably simple: there's the sys.dm_os_volume_stats dynamic management function, the FILEPROPERTY metadata function, and even the (unsupported) extended procedure xp_fixeddrives to name a few. Running queries on demand is a good place to start, but you'll probably want to have some automation, right? A tactic I have used (and that I've seen frequently) is to set up a SQL Agent Job that runs one or more queries and takes action, based on the query outcome.
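For example, an on-demand check built from those functions might look like this sketch, showing space used within each [tempdb] data file plus free space on the hosting volume:

USE tempdb;

SELECT f.name,
    f.size / 128 AS FileSizeMB,                          -- size is in 8 KB pages
    FILEPROPERTY(f.name, 'SpaceUsed') / 128 AS SpaceUsedMB,
    vs.available_bytes / 1048576 AS VolumeFreeMB
FROM sys.database_files AS f
CROSS APPLY sys.dm_os_volume_stats(DB_ID(), f.file_id) AS vs
WHERE f.type_desc = 'ROWS';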

Naturally, the job runs on a predefined schedule. But how frequently should we check disk/available space for [tempdb]? The temporary nature of [tempdb] makes this a difficult question: objects within aren't saved from one session of SQL Server to another, and evidence to explain runaway growth or loss of available space may be gone before an assessment can be made. Whatever schedule I decide on, I'll always wonder if it's frequent enough (or too frequent).

It's tempting to "over-schedule" a job's frequency, perhaps running it as often as every X seconds. Asking SQL Server "Are we out of disk space?" over and over again doesn't make a lot of sense to me, though. It reminds me of Bart and Lisa asking Homer "Are we there yet?" until he snaps. Ideally, instead of asking, I want SQL Server to *tell me* when disk/available space is running low.


Performance Condition Alerts

For [tempdb] data files that have a set maximum file size, a performance condition alert for the "Free Space in tempdb (KB)" counter can be created. It's part of the "Transactions" object. (On a side note, trying to find an appropriate counter by exploring what's available in each object, one at a time, is tedious. It's so much easier to query the sys.dm_os_performance_counters DMV.) In the example below, the alert fires if free space falls below 102,400 KB (100 MB). The alert responds by executing a job "Log tempdb Space Usage Stats", and has a five-minute delay between responses.

[Screenshots: SQL Server Agent performance condition alert configuration]
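The counter can be located via the DMV, and the alert itself can be scripted. A hedged sketch (the counter's object-name prefix varies by instance, e.g. MSSQL$<InstanceName> for a named instance):

SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Free Space in tempdb (KB)';

EXEC msdb.dbo.sp_add_alert
    @name = N'tempdb free space below 100 MB',
    @performance_condition = N'SQLServer:Transactions|Free Space in tempdb (KB)||<|102400',
    @delay_between_responses = 300,    -- seconds
    @job_name = N'Log tempdb Space Usage Stats';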

This approach works well for [tempdb] files that are configured with an initial size equal to their maximum size. With no possibility for auto-growth, you'd decide on the free space threshold amount, do a little math to convert it to kilobytes, and enter the value on the General page of the Alert dialog. If [tempdb] configuration and available disk space allow for future auto-growth events, take into consideration the alert may fire more often than desired.


WMI Alerts

For [tempdb] data files configured with an unlimited Maximum File Size, a WMI alert can be configured for the DATA_FILE_AUTO_GROW event. In the previous example, the alert (or more specifically, the counter) only applies to [tempdb]. But the DATA_FILE_AUTO_GROW event applies to data files for *all* databases. In the example below, the WQL query will restrict the alert to just [tempdb] via a WHERE clause. Like the first example, the alert responds by executing the job "Log tempdb Space Usage Stats", and has a five-minute delay between responses.

[Screenshots: SQL Server Agent WMI alert configuration]
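A hedged sketch of the same alert, scripted (the WMI namespace ends with the instance name; MSSQLSERVER is the default instance):

EXEC msdb.dbo.sp_add_alert
    @name = N'tempdb data file auto-grow',
    @wmi_namespace = N'\\.\root\Microsoft\SqlServer\ServerEvents\MSSQLSERVER',
    @wmi_query = N'SELECT * FROM DATA_FILE_AUTO_GROW WHERE DatabaseName = ''tempdb''',
    @delay_between_responses = 300,    -- seconds
    @job_name = N'Log tempdb Space Usage Stats';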

This approach works well for [tempdb] files that are configured with room for growth. Note the event fires once per data file growth event. With the likelihood of multiple data files, the "Delay between responses" remains important. Also, if [tempdb] data files grow enough to consume all available disk space, the last alert might occur long before [tempdb] runs out of available space internally. As an alternative to a WMI event, an Event Notification can be configured for the DATA_FILE_AUTO_GROW event.


What to Log

Depending on how [tempdb] data files are configured, it makes sense to check either available disk space or available space within [tempdb] itself (or both) when an alert fires. Determining what is consuming so much space in [tempdb] makes sense too. Microsoft's documentation includes some really good queries for monitoring [tempdb] use. Sunil Agarwal also has an article I found helpful. (It's an oldie, but a goodie.) Additionally, I've found these DMVs useful (a sample query follows the list):

  • sys.dm_db_file_space_usage
  • sys.dm_db_task_space_usage
  • sys.dm_db_session_space_usage
  • sys.dm_tran_session_transactions
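Here's a hedged sketch of the kind of query the response job might log: which sessions are consuming [tempdb] space (page counts are 8 KB pages), and whether they have an open transaction.

SELECT su.session_id,
    su.user_objects_alloc_page_count AS UserObjectPages,
    su.internal_objects_alloc_page_count AS InternalObjectPages,
    st.transaction_id
FROM sys.dm_db_session_space_usage AS su
LEFT JOIN sys.dm_tran_session_transactions AS st
    ON st.session_id = su.session_id
ORDER BY su.user_objects_alloc_page_count
    + su.internal_objects_alloc_page_count DESC;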


SQL Server Features Discovery Report


I don't need to validate SQL Server installations on a regular basis. When the need arises, my preference is to run the SQL Server features discovery report. Further, I prefer to run it from the command line. After looking up command line parameters one too many times, I decided to script it out.
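For reference, the documented command looks like this; the setup.exe path varies by version (the "130" folder shown here is SQL Server 2016):

"C:\Program Files\Microsoft SQL Server\130\Setup Bootstrap\SQLServer2016\setup.exe" /Action=RunDiscovery /q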