Oh what a difference a year makes! Well, to be more precise, 14 months and change… But I digress. This past year has been an exhilarating one for me. I have been very busy on just about every front I could be.

  • My son has moved on from Cub Scouting to Boy Scouting, and I am in the process of getting leadership training so that I can help to facilitate his transition into the awkward teen years.
  • I got engaged over the Christmas holiday, to my best friend, the woman whom I have dated for the past two years, who is the absolute love of my life.
  • I have gone from helping support the OPASS Orlando chapter of PASS to working with the SCSUG here in Brevard County, FL.
  • Last but not least, I have gone from working with one of the biggest healthcare providers in Central Florida to a Federal Government contractor providing health care to our country’s finest in Iraq, Afghanistan, Africa, and here at home.

The part of my professional transition that I am most excited about is the arc I have traveled: from working with some of the most outspoken and prolific SQL Server professionals back in ’07, to working on a very qualified team (some 20+ certifications among us) spanning the realms of data, storage, and system administration, to now flying totally solo.

I have to admit, it is a little humbling when you walk into a shop all by your lonesome and sit down to get to work, after years of working on teams with several other folks (some of the best in the business) in the same arena as you. The place I am at now has had very bad luck with DBAs. So much so that I found myself apologizing to the hiring manager during the interview, because DBAs had given them such a horrible experience. I vowed right then and there to turn that impression around, to make their impression of what we do one of adulation and appreciation. It hasn’t been easy. To say that they have had bad luck is being somewhat modest. When I sat down to tackle this environment and rein in the rampant security violations and the configuration horrors, I realized a large part of the reason they had such a hard time: it looked to me like nothing had ever been done to tame the data beast! Nothing!

Enter everything I have learned over the last six to seven years of being a DBA. Everything.

I cannot tell you how much I appreciate the skills I have been taught by my mentors. I have had to rely on every single ounce of knowledge I have gained, not only from the people I have worked with, but also the people who write amazing blogs, like Jonathan Kehayias, Grant Fritchey, Thomas LaRock, or Ola Hallengren. The ones who pen articles for SQL Server Pro, for Simple-Talk, or for SQL Server Central. The ones who write SQL Server books like “SQL Server 2008 Internals”, “SQL Server MVP Deep Dives” (Vols. 1 and 2), or “Pro SQL Server 2012 Reporting Services” (by some very dear friends of mine). These folks have all had a serious impact on my career, and for that I am eternally grateful.

So, to get back to the title, “I’ve Missed You…” – I have. And I will be looking forward to blogging my experiences with SQL and the characters that I come across in my adventures.


After reading a blog post from Tom LaRock (b|t|g+) from about this time last year, and then a post from Brad Schulz (b) that expounded on it with regard to outliers, I found that both posts did a fine job of explaining how to get information on long-running jobs that have completed.

However, I was tasked to alert on long-running jobs that are currently running. Because the sysjobhistory table will not get updated with data until a job step completes, the above approaches would not work in whole for my purpose. They are a great start, and a necessary insight to finding long-running jobs, but they only look at the past. I needed to look at the present.

There can be many reasons why a SQL Server Agent job hangs. Some of the issues resulting from a hung job are outlined here:

  • Resources can be consumed for the duration
  • Object blocking/locking due to open transactions
  • No notifications
  • The list goes on….

On to the solution –

To get the currently running jobs on a server, I executed the sp_help_job system stored procedure in the msdb database:
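A minimal sketch of that call might look like this:

```sql
-- List only the Agent jobs that are currently executing.
-- @execution_status = 0 returns jobs that are not idle or suspended.
EXEC msdb.dbo.sp_help_job @execution_status = 0;
```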

The @execution_status = 0 parameter will bring back all jobs that are not idle or suspended. Refer to the sp_help_job documentation to see the different status values.

Now that I know which jobs are executing (if any), how do I know what their typical execution time might be, or what the current elapsed execution time is? Enter the sysjobactivity and sysjobhistory msdb system tables. The sysjobactivity table is populated with all currently defined Agent jobs each time the Agent service is started. This table feeds the job activity monitor. The sysjobhistory table contains information on executed jobs. As stated previously, this table is only updated after a job step completes. Perfect for finding the average duration of successful execution of jobs. With this in mind, you can join these tables to get all the information you need to determine current and historical data.

In my case, the sysjobactivity table had two records for each Agent job on one of my servers. This can happen when the SQL Server Agent service terminates while jobs are running. To grab the correct rows, I used the session_id column of sysjobactivity, which corresponds to the same column in the syssessions table; syssessions gets a new session_id and agent_start_date each time the SQL Server Agent service is restarted. I joined the sysjobactivity table and the syssessions table, using MAX(session_id) to ensure I got the correct activity record.
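A sketch of that join, under the assumption that the most recent syssessions row always represents the current Agent session:

```sql
-- Keep only activity rows from the current Agent session; older sessions
-- can leave duplicate rows in sysjobactivity after an Agent restart.
SELECT  j.name,
        ja.start_execution_date,
        ja.stop_execution_date
FROM    msdb.dbo.sysjobactivity AS ja
        JOIN msdb.dbo.sysjobs AS j
            ON j.job_id = ja.job_id
WHERE   ja.session_id = (SELECT MAX(session_id)
                         FROM msdb.dbo.syssessions);
```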

Next, I wanted to utilize a “central management” server to run this script. Sure, you could monitor each server separately, but that just adds more jobs for me to watch, and report on in case of failure. Using linked servers and OPENROWSET, I can cycle through each of my SQL Servers, joining a result set from sp_help_job, sysjobactivity, and sysjobhistory. The only question I had here was how to get the server names\instance names that I wanted to cycle through. I declared a table variable, @Servers, and used hard-coded values to insert the data into that table for each server that I wanted to spin through. That method, however, is not exactly scalable. To fix that issue, I pulled the server names\instance names from the sys.servers system view in the master database.
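The scalable version of that lookup might be sketched like this; the shape of the @Servers table variable is an assumption on my part:

```sql
-- Pull server\instance names from sys.servers instead of hard-coding them.
DECLARE @Servers TABLE (Server_ID INT IDENTITY(1,1), ServerName SYSNAME);

INSERT INTO @Servers (ServerName)
SELECT name
FROM   master.sys.servers
WHERE  is_linked = 1;   -- adjust the filter to match your environment
```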

So, now to put it all together, the script looks something like this:

  • Declare all the variables you will need.

Notice the execution of sp_configure to turn on the advanced option “Ad Hoc Distributed Queries.” Security measures in our shop dictate that this option be turned off. This block of the script enables the option just for the duration of the job; it is turned off again at the conclusion.
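That enable/disable pattern looks something like the sketch below:

```sql
-- Enable the option just for the duration of the script...
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

/* ... the OPENROWSET work goes here ... */

-- ...and turn it back off at the conclusion.
EXEC sp_configure 'Ad Hoc Distributed Queries', 0;
RECONFIGURE;
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE;
```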

  • Populate the @Servers table variable, and get the first Server_ID for iteration. Use a while loop to spin through all the servers in the @Servers table variable, setting the @ServerName variable for use within the server context.
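A skeleton of that iteration, with the per-server work elided, might look like this (variable names per the description above):

```sql
-- Spin through the @Servers table variable one server at a time.
DECLARE @Server_ID INT, @ServerName SYSNAME;

SELECT @Server_ID = MIN(Server_ID) FROM @Servers;

WHILE @Server_ID IS NOT NULL
BEGIN
    SELECT @ServerName = ServerName
    FROM   @Servers
    WHERE  Server_ID = @Server_ID;

    /* ... build and run the per-server query in @ServerName's context ... */

    SELECT @Server_ID = MIN(Server_ID)
    FROM   @Servers
    WHERE  Server_ID > @Server_ID;
END
```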

  • With the server context set, create the connection string for the OPENROWSET call, and create the SQL script that will grab all the data you will need to make your comparisons.

The script uses the run_duration column of the sysjobhistory table to calculate an average execution time of successful job runs, and adds a standard deviation to the average. Other posts talked about adding more than one STDEV, but in the case of a backup of a VLDB, I have seen the job take 10 – 13 hrs., and adding two standard deviations would allow for too much variance. The calculations have to be done remotely because 1) the historical time is an aggregate, and 2) the current time could be in a different time zone than my management server. Also, I am grabbing the job name from the sp_help_job system stored procedure to limit the number of joins in the query. In the WHERE clause, I am limiting the AVG to successful executions, and to times for the entire job. Finally, insert the results into the @RunningJobTimes table variable to spin through the jobs per server and compare times, one at a time.
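The historical-threshold piece of that query might look like the sketch below; the HHMMSS-to-seconds conversion of run_duration is the key detail, and the OPENROWSET wrapper is omitted for readability:

```sql
-- run_duration is stored as an HHMMSS-style integer (e.g. 13059 = 1:30:59),
-- so convert it to seconds before aggregating.
SELECT  jh.job_id,
        AVG(jh.seconds) + STDEV(jh.seconds) AS ThresholdSeconds
FROM   (SELECT job_id,
               run_duration / 10000 * 3600       -- hours
             + run_duration % 10000 / 100 * 60   -- minutes
             + run_duration % 100                -- seconds
               AS seconds
        FROM   msdb.dbo.sysjobhistory
        WHERE  run_status = 1    -- successful runs only
          AND  step_id    = 0    -- the job outcome row, not individual steps
       ) AS jh
GROUP BY jh.job_id;
```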

  • If there were rows inserted into the @RunningJobTimes table variable, this means that there are jobs running, and we want to compare the current execution time(s) vs. the historical time calculated in the script above.

Here, we use the @@ROWCOUNT built-in system function to check if any rows were inserted into the @RunningJobTimes table variable. If yes – run the comparison. If not – skip it. The comparison is simple. If the current execution time is greater than the calculated historical time, build an email and send it, using the sp_send_dbmail stored procedure. In order to make the email easily readable on a mobile device, the body of the email is short and sweet. It includes the name of the server first in the subject line, so the DBA can begin triage without having to open the email. The body includes the job name, the time it has been executing, and the amount of overrun.
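A sketch of that alert; the variable values, mail profile, and recipient below are placeholders, not the real ones from my shop:

```sql
-- Placeholder values to make the sketch self-contained.
DECLARE @ServerName        SYSNAME = N'PRODSQL01',
        @JobName           SYSNAME = N'Nightly Backup',
        @CurrentRunSeconds INT     = 7200,
        @ThresholdSeconds  INT     = 5400;

IF @CurrentRunSeconds > @ThresholdSeconds
BEGIN
    -- Server name first in the subject, so triage can start from the inbox.
    DECLARE @Subject NVARCHAR(255) =
        @ServerName + N': long-running job - ' + @JobName;
    DECLARE @Body NVARCHAR(MAX) =
        N'Job ' + @JobName
      + N' has been running for ' + CAST(@CurrentRunSeconds AS NVARCHAR(12))
      + N' seconds, '
      + CAST(@CurrentRunSeconds - @ThresholdSeconds AS NVARCHAR(12))
      + N' seconds over its threshold.';

    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = N'DBA Alerts',      -- assumed profile name
         @recipients   = N'dba@example.com', -- assumed recipient
         @subject      = @Subject,
         @body         = @Body;
END
```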

  • Clear out the @RunningJobTimes table variable, and move on to the next server. After the last server has been iterated, turn off the “Ad Hoc Distributed Queries” advanced option.

This script can easily be turned into a stored procedure, and an Agent job. All you have to do is remove the advanced-options blocks, put CREATE PROCEDURE usp_<procedure_name> AS BEGIN at the top, and replace the GO at the bottom of the script with END. If your shop is like mine, you can instead put the advanced-options blocks into the Agent job as steps one and three, with the stored procedure execution being step two.

I was looking for a topic to write my next blog post on, when I read Thomas LaRock’s (t|b|g+) post about helping the #SQLFamily give even more. Where should I begin?

The #SQLFamily to me, is what the Oracle (no pun intended) was to Neo in the Matrix. It is what Albus Dumbledore was to Harry Potter. It is… well, it is family.

I have run into the SQL brick wall, and summoned #sqlhelp – at seemingly all hours of the day or night – and there has been a response, a quick (witted at times), accurate response. I have visited AskSQLServerCentral to peruse the veritable KB of Q & A for ideas to add to my solutions. I have browsed SQL forums and articles across the community and have found answers to some of the most obscure SQL Server questions and issues.

I have received a wealth of knowledge from the abundance of resources the SQL community has provided, and as many others before me, I have felt the desire to pay it forward. I have volunteered for PASS, ramping up for the annual PASS Summit in Seattle; I have volunteered for some of the various SQL Saturdays that have been held across Florida; I have answered some questions on Ask SQL Server Central, and I started this blog post – all in an effort to try and give back whatever I can to the community that has given so much.

I am rather fortunate to have met a number of the people that I interact with via the “interwebz”. Each and every one of the folks that I have met has been as authentic and genuine as I could have imagined. Our conversations have been welcoming and informative, and I have gained friends in the process. I have shared thoughts, ideas, and barley with a lot of the folks, and the memories are true treasures.

The #SQLFamily to me, is priceless.

I love to learn new things! As Albert Einstein once said,

“Learn from yesterday, live for today, hope for tomorrow. The important thing is not to stop questioning.”

Not to stop questioning… Hmmm, I think I can live with that.

Speaking of questioning, my learning experience today was actually a question a colleague of mine asked, but I was not sure of the answer. The question was, “If you have an SSIS package with configurations enabled, and you are using an XML configuration file, do you have to specify that config file in the Agent job?” I don’t like saying, “I don’t know”, so my answer was, “Give me a few minutes.”

I created a test case, because I had several questions that I wanted to answer. First, I created a quick database with one table that had an integer identity column (force of habit to create a key!), a column that would accept a string of different values, depending on where that value came from, and a column for an import date, again, out of force of habit.
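A sketch of those test objects; the database name and column names are my assumptions, but the table name matches the insert routine described next:

```sql
-- Quick test database and table for the configuration experiment.
CREATE DATABASE ConfigTest;
GO
USE ConfigTest;
GO
CREATE TABLE dbo.Config_Test
(
    Config_Test_ID INT IDENTITY(1,1) PRIMARY KEY, -- force of habit!
    Config_Value   VARCHAR(100) NOT NULL,         -- where the value came from
    Import_Date    DATETIME NOT NULL              -- again, force of habit
);
```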

Next, I created a simple SSIS package. One object, an Execute SQL Task, with an insert routine (INSERT INTO Config_Test VALUES (?, GETDATE())), using a package variable as the only parameter.

The next thing was to create an Agent job to execute the package. I have not defined any configurations yet, so that I can get results that I am familiar with and expect before I try to retrieve results that I am not as familiar with. As with the SSIS package, the Agent job is a simple, one-step job that will simply execute the SSIS package, nothing more.

When I execute the job, I get the results I expected. The Package Variable’s value is inserted into the test table, and all is well with the world.

So, now that I have the return I expected, let’s see what happens when I enable configurations on the package and not the Agent job…

I enabled configurations (In Control Flow context, SSIS -> Package Configurations…) and then I designated an XML file for my configuration. I only exported the Package Variable’s value to the config file. I saved the SSIS package, and now I want to edit that XML file so that the variable’s value will reflect where the value is coming from.

I do not have to make any changes to the Agent job. When I execute the job again, will the package use the config file, even though it is not designated in the job?

Yes! Even if I do not specify that my SSIS package uses a config file, it will use the configuration file that I enabled within the package. So what is the configuration tab in the Agent job for? I made another configuration file, based on the first, and updated the value accordingly.

This time, I updated the Agent job to use the new, updated XML configuration file, while leaving the configuration file enabled in the package.

In essence, I have two different configuration files designated with this package. The first in the package, and the second, in the Agent job. Now this is a scenario I am curious about. Which configuration will the SSIS package use?

That settles it! If configurations are enabled in an SSIS package, they are not required in the Agent job that runs the package. Furthermore, if a configuration file is specified in the Agent job, that configuration is used in preference to the one enabled in the SSIS package itself. This could be useful for testing different configurations from an Agent job without changing the configuration designated in the package; it can even be a totally different file.

I love to learn new things!

SQL Saturday 85 is in the record books. A great time was had by everyone that I spoke with, speakers, volunteers, and attendees alike. Shawn McGehee (t|g+|b), Karla Landrum (t|g+|b), Kendal Van Dyke (t|g+|b), and Bradley Ball (t|b) all did a fantastic job putting things together.

The planning committee tried something new this year in Orlando with room proctors to assist the speakers with eval forms, and a “five-minute warning” (not totally unlike the two-minute warning in the NFL, except maybe a little more subtle). All in an effort to assist the speakers, and get a better return on their eval forms, which have been somewhat lacking in years prior. I would be interested in hearing from the speakers to see if they liked or disliked this approach, and what – if anything – they would like to see done differently next year.

Buck Woody (t|g+|b) and some of the folks from Pragmatic Works held Pre-Cons as a prelude to the day of immersion that was SQL Saturday 85. With seven tracks, and over forty speakers, along with no fewer than 23 sponsors, there was plenty of learning to be done, indeed! I tried to attend one session in each track, but the quality and quantity of offerings were so vast that making my decision was not as easy as I had hoped.

I spent the morning listening in on Buck’s session, titled “From Ground to Cloud”, and if there is anything I took away from that session, it was, “Don’t put your encrypted data in my cloud. Don’t even ask!” In reality, there were several good take-aways from his session, and I am looking forward to digging deeper into SQLAzure as it progresses through its life cycle. Secondly, I paid a visit to my friend, Eric Wisdahl (t|g+|b), as he spoke about SSIS data flow buffers. Eric always has great content in his in-depth sessions, and I was especially interested in the performance gains from managing my data flow buffers more efficiently. For my third session, I stopped in to Rodney Landrum’s (t|b) “T-SQL Alphabet Soup”. As the name suggests, this was a fun one! Who knew there was a reserved word for every letter of the alphabet?! Not to mention, it makes for a great drinking game 🙂

Lunch was another new concept, with the speakers serving up some of the best BBQ in Orlando, catered by Keller’s BBQ. Here, the planning committee got a little creative and had some fun, too! The speakers all donned their aprons (some got a little confused and thought they were super-hero capes, but we didn’t tell them any different!) and suddenly they all became chefs. Jorge Segarra (t|g+|b) was the lone cannibal in the group serving up BBQ’d @SQLChicken to attendees, much to his chagrin, after he learned of his brethren’s fate 😮

Several vendors offered up mini-sessions during the latter-half of the lunch break, and the response I heard was that they were packed! This is great news, given the time and money our sponsors offer to help us hold these events. Being receptive and appreciative to them will almost guarantee a return visit!

Finally, after lunch, I stopped in to check out Bradley Schacht’s (t|g+|b) presentation on dynamic SSIS. (There was even a little slap-schtick comedy tossed into the mix with this one, how does that face feel, Jorge?!) This session was of particular interest to me, because I helped to develop an SSIS configuration database standard at a previous position, and any insight into how some of the best in the business are doing this is greatly welcomed. It was quite reassuring to know that we were in agreement on a vast majority of best practices with configurations in SSIS, and I learned that the fate of configurations in Denali will leave me a bit of a learning curve. I am quite OK with that, since I love to learn, and this career is one learning opportunity after another! In my final session of the day, I could not resist checking out Mike Davis (t|g+|b) and Adam Jorgensen’s (t|g+|b) SQL Smackdown (with @SQLBalls as the totally un-biased referee ;)) As is usual with the crew from Pragmatic Works, their session was as much entertainment as it was enlightenment. Jabs were thrown, pics were shown, and in the end, Adam’s T-SQL solution (even without being executed… un-biased, remember?) came out the victor! Not to be outdone, Mike stormed off in disgust, and then laid a smack-down SSIS-style on the ref!

Andy Warren (t|b) stayed true to tradition, with his annual “book drop” at the conclusion of the event, and each of the sponsors had some really great raffle prizes to give away to a few lucky attendees. Noticeably absent, however, was our good friend Jack Corbett (t|g+|b), who moved back up north earlier this year.

All-in-all, it was a great day of SQL learning. The weather couldn’t have been more perfect, the lunch was the best I have ever had for $5, the speakers and sponsors were top-notch, and the venue was absolutely beautiful. There are plenty of pictures to peruse, so please, help yourself. See you at the next SQL Saturday!

(As a side note, this is my very first published blog post. There will certainly be more down the road, but I would like any constructive criticism that any readers would like to leave. I’m jumping in with both feet and not looking back!)