Friday, June 22, 2012

The Distribution Agent for a Transactional Publication Will Ignore the MaxBCPThreads Setting by Default

The company I work for hosts a SaaS solution for multiple customers, and we utilize transactional replication to push a copy of the customer's data from our SQL servers to a SQL server on-site at the customer's location.  This allows much greater reporting flexibility without impacting the performance of the production SQL Server.

We are currently migrating our largest customer to a new SQL server (which requires us to set up replication from scratch and generate/push a new snapshot to initialize the new transactional publications).  By default, SQL Server ignores our setting for MaxBCPThreads in the Distribution Agent profile and will only push the BCP files serially.  With this customer we have a large pipe between our data center and the customer's location (on the opposite coast), but serial snapshot delivery only allows us to consume about 10% of the available bandwidth (even with the maximum packet size option of -PacketSize 32767 in the command line of the Distribution Agent job).

In short, to address the issue we needed to recreate the publication and specify the sp_addpublication argument @sync_method = 'native'.  However, the reason why is a bit obscure.

According to SQL Server 2005, 2008, and 2008 R2 Books Online, the default @sync_method is character (non-concurrent character mode BCP output) for snapshot publications and concurrent_c (concurrent character mode BCP output) for all other publication types.  However, in my experience, the actual defaults are native (non-concurrent native mode BCP output) for snapshot publications and concurrent (concurrent native mode BCP output) for all other publication types.  This is actually a good thing, as the native BCP snapshot format is faster and more efficient to deliver to the Subscriber.  The character formats are only necessary for publications with non-SQL Server Subscribers.

The default in SQL Server 2000 was native, and according to SQL Server 2012 Books Online, the default has been changed back to native for all SQL Server publications.

So what is the difference between native and concurrent?  Books Online indicates that the concurrent option "Produces native-mode bulk copy program output of all tables but does not lock tables during the snapshot," whereas the native option locks the tables for the duration of snapshot generation.  However, this is not the whole story.  When using the concurrent option the tables are not locked for the entire duration of snapshot generation, but the very last step of the snapshot generation process is to lock the tables to capture the delta.  This may still cause blocking (it does in our environment), but the impact should be greatly reduced because the tables are locked for a much shorter period of time.

So what does this have to do with snapshot delivery?  In addition to affecting snapshot generation, the concurrency setting also affects snapshot delivery at the Subscriber.  When using the concurrent sync_method, the Distribution Agent will ignore the MaxBCPThreads setting (in the Agent Profile) and deliver the BCP files in the snapshot serially.  With smaller databases this is not a major issue.  However, in my case, I'm trying to push snapshots that are around 100GB, and serial delivery of the BCP files does not take advantage of the bandwidth we have between our site and our customer's site.  From SQL Server Books Online: "When applying a snapshot that was generated at the Publisher using the concurrent snapshot option, one thread is used, regardless of the number you specify for MaxBcpThreads."

By utilizing the native snapshot process (and increasing the MaxBCPThreads value from the default of 1 to 16), I can now push 16 BCP files simultaneously to the subscriber, thus taking full advantage of the large pipe between our site and our customer's site.  There is no upper limit to what MaxBCPThreads can be set to, but you don't want to set it too high and overwhelm the CPUs on the distributor or subscriber.  I've successfully tested 32 simultaneous threads on a system with only 16 cores (no hyper-threading).
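For reference, here is roughly what the relevant arguments might look like in the Distribution Agent job step.  All server, database, and publication names below are placeholders, and note that a -MaxBcpThreads value supplied on the command line overrides the value in the agent profile:

```text
-Publisher [MyPublisher] -PublisherDB [MyPublishedDB] -Publication [MyPublication]
-Subscriber [MySubscriber] -SubscriberDB [MySubscriberDB]
-MaxBcpThreads 16 -PacketSize 32767
```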

Note: You cannot change the concurrency setting of a publication on the fly.  You will need to drop and recreate the publication via a T-SQL script.  You can use one of the following methods to script the publication creation:
  • Right-click the existing publication in SQL Server Management Studio and choose Generate Scripts... (I usually output to a new Query Editor window and then save the script once I've made the appropriate changes)
  • Go through the GUI to set up the publication and choose Generate a script file with steps to create the publication (save the file to your hard drive and then open with SSMS once script generation is complete)
Once you have the script, change @sync_method = 'concurrent' to @sync_method = 'native'.  You will also want to verify that any SQL Server authentication passwords are correct (for security purposes, SQL Server will not extract the passwords from the existing publication).
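For illustration, the relevant portion of the scripted publication might look like the following.  The publication name and other argument values are placeholders; a script generated by SSMS will contain many more parameters, and @sync_method is the only one that needs to change:

```sql
EXEC sp_addpublication
    @publication       = N'My Publication',
    @sync_method       = N'native',   -- was N'concurrent'
    @repl_freq         = N'continuous',
    @status            = N'active',
    @independent_agent = N'true';
```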


Resources used in my research to address this issue: 

I'd specifically like to thank Hilary Cotter for providing a very swift answer to my dilemma.  His contribution to the SQL Server community is greatly appreciated!

Friday, April 27, 2012

BCP Command Line Utility Installation Process


SQL Server 2012 BCP command line utility installation

Recently, I needed to install the BCP utility on our job automation servers.  These servers do not need the full suite of SQL Server Workstation Tools, just the BCP.exe command line utility.  I found the following process worked quite well at installing only what was needed (thus saving around 1 GB of disk space by not installing the Workstation Tools).

The following components need to be installed on a standalone server in order to get the BCP utility functioning.


Since the servers I was dealing with were running Windows Server 2003, I needed to install Windows Installer 4.5.  Newer versions of Windows Server (e.g. Windows Server 2008 R2) already include a newer Windows Installer.  If installation of the Windows Installer is needed, a reboot will be required.

Installing the Native Client and the Command Line Utilities is as simple as running the installers.  A reboot is not required for either the Native Client or the Command Line Utilities.

These instructions are specifically for the SQL Server 2012 (and 2008 R2) BCP utility, but they should be easy to adapt for other versions of SQL Server.

That's it!  Once the aforementioned components are installed, the command line BCP utility should work beautifully.
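As a quick smoke test (server, database, and table names below are placeholders for your environment), you can confirm the utility is on the PATH and can reach your server from a command prompt:

```text
:: Print the bcp version to confirm the utility is installed
bcp -v

:: Hypothetical export using a trusted connection (-T) and native format (-n)
bcp MyDatabase.dbo.MyTable out C:\Temp\MyTable.dat -S MyServer -T -n
```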

Wednesday, April 25, 2012

Replication Troubleshooting - How to deal with out of sync publications

Transactional Replication and nasty errors that cause out of sync publications.

The other day we had an issue on our distributor that caused deadlocks on the Distribution database.  Several of the Log Reader Agents suffered fatal errors due to being chosen as the deadlock victim.  This caused the following error to occur:
  • The process could not execute 'sp_repldone/sp_replcounters' on 'MyPublisherServer'
When I drilled in to view the detail, I found this error:
  • The specified LSN (%value) for repldone log scan occurs before the current start of replication in the log (%newervalue)


After much searching on the error, I came across several forum posts that indicated I was pretty well up a creek.  I then found this post on SQLServerCentral.  Hilary Cotter's response was the most beneficial for devising a recovery plan and Stephen Cassady's response helped me refine that plan.

Hilary Cotter (Blog) is an expert when it comes to SQL replication.  He certainly knows his stuff!


The Recovery Plan
Recovering from this issue involves several steps.  

For small databases or publications where the snapshot to reinitialize the publication will be small and push quickly, it's simplest and best to just reinitialize the entire publication and generate/push a new snapshot.  

For larger publications (my publication contained almost 1,000 tables) and situations where pushing the snapshot will take an inordinate amount of time (24+ hours in my case) the following process can be used to skip the missing transactions and identify the tables that are now out of sync:
  • Recover the Log Reader Agent by telling it to skip the missing transactions
  • Recover the Distribution Agent by configuring it to ignore data consistency issues
  • Validate the publication to determine which tables are out of sync
  • Drop and republish out of sync tables


Log Reader Agent Recovery
The simplest way to recover the Log Reader Agent is to run the following command against the published database:
  • sp_replrestart
This effectively tells SQL Server to restart replication NOW, ignoring all transactions that occurred between the time of the failure and the time you run the command.  The longer you wait to run this command, the more database activity gets ignored, which likely results in more tables falling out of sync.
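Assuming a placeholder database name, the command is run at the Publisher in the context of the published database:

```sql
-- Run at the Publisher, in the published database (name is a placeholder)
USE MyPublishedDB;
GO
EXEC sys.sp_replrestart;
```

Keep in mind that sp_replrestart is strictly a recovery tool; the transactions it skips are gone from replication for good, which is why the validation steps below are essential.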


Distribution Agent Recovery
Now that the Log Reader Agent is capturing transactions for replication, the Distribution Agent will likely get upset because there are transactions missing.  I specifically received the following error:
  • The row was not found at the Subscriber when applying the replicated command
This error causes the Distribution Agent to fail, but there is a system profile for the Distribution Agent that you can select to bypass the data consistency errors.
  • Launch Replication Monitor
  • In the left-hand column
    • Expand the DB server that contains the published database
    • Select the Publication 
  • In the right-hand pane
    • Double-click the Subscription
  • In the Subscription window
    • Go to the Action menu and select Agent Profile
    • Select the profile Continue on data consistency errors and click OK
      • Be sure to note which profile was selected before changing it so that you can select the appropriate option once recovery is complete
  • If the Distribution Agent is currently running (it's likely in a fail/retry loop), you'll need to:
    • Go to the Action menu and select Stop Distribution Agent
    • Go to the Action menu and select Start Distribution Agent
  • If there is more than one subscription, repeat these steps for any additional subscriptions


Subscription Validation
Validating the Subscription(s) is a fairly straightforward task.
  • Launch Replication Monitor
  • In the left-hand column of Replication Monitor
    • Expand the DB server that contains the published database
    • Right-click the Publication and select Validate Subscriptions...
    • Verify Validate all SQL Server Subscriptions is selected
    • Click the Validation Options... button and verify the validation options - I recommend selecting the following options:
      • Compute a fast row count: if differences are found, compute an actual row count
      • Compare checksums to verify row data (this process can take a long time)
    • Once you are satisfied with the validation options, click OK and then click OK to actually queue up the validation process
      • Please note: for large databases, this process may take a while (and the Validate Subscriptions window may appear as Not Responding)
For my publications (~1,000 tables and DB was ~100GB) the validation process took about 20 minutes, but individual results will vary.
If you wish to monitor the validation progress:
  • In the right-hand pane of Replication Monitor
    • Double-click the Subscription
  • In the Subscription window:
    • Go to the Action menu and select Auto Refresh
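If you prefer T-SQL over the GUI, the same validation can be queued with sp_publication_validation, run at the Publisher in the published database.  Names below are placeholders, and the parameter values shown are my best match for the GUI options above (verify them against Books Online for your version):

```sql
-- Run at the Publisher, in the published database (placeholder names)
USE MyPublishedDB;
GO
EXEC sp_publication_validation
    @publication   = N'My Publication',
    @rowcount_only = 2,  -- 2 = row count and binary checksum
    @full_or_fast  = 2;  -- 2 = conditional fast count (falls back to full count)
```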


Identify out of sync tables
I created the following script that will return the tables that failed validation:

-- This script will return out of sync tables after a Subscription validation has been performed
-- Set the isolation level to prevent any blocking/locking 
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT 
mda.publication [PublicationName],
mdh.start_time [SessionStartTime],
mdh.comments [Comments]

FROM distribution.dbo.MSdistribution_agents mda 
JOIN distribution.dbo.MSdistribution_history mdh ON mdh.agent_id = mda.id 

-- Update Publication name as appropriate
WHERE mda.publication = 'My Publication'
AND mdh.comments LIKE '%might be out of%'
-- This next line restricts results to the past 24 hours.
AND mdh.start_time > (GETDATE() - 1) 
-- Alternatively, you could specify a specific date/time: AND mdh.start_time > '2012-04-25 10:30'
-- View most recent results first
ORDER BY mdh.start_time DESC

The Comments column will contain the following message if a table is out of sync:
  • Table 'MyTable' might be out of synchronization.  Rowcounts (actual: %value, expected: %value).  Checksum values  (actual: -%value, expected: -%value).
Make a list of all tables that are returned by the aforementioned script.

Now the determination needs to be made as to the level of impact.
  • The Reinitialize All Subscriptions option should be used if the following is true:
    • Large number of tables affected (majority of published tables)
    • Unaffected tables are small in size (if the snapshot for the unaffected tables is going to be very small, it's much easier to just reinitialize everything)
  • Dropping and re-adding individual tables should be used if the following is true:
    • The number of tables affected is far less than the total number of tables
    • The tables that are unaffected are very large in size and will cause significant latency when pushing the snapshot
The latter was the case in my scenario (about 100 out of 1,000 tables were out of sync, and the ~900 tables that were in sync included some very large tables).


Reinitialize All Subscriptions
Follow this process if the determination has been made to use the Reinitialize All Subscriptions option:
  • In the left-hand column of Replication Monitor
    • Expand the DB server that contains the published database
    • Right-click the Publication and select Reinitialize All Subscriptions...
    • Verify Use a new snapshot is selected
    • Verify Generate the new snapshot now is NOT selected
    • Click the Mark For Reinitialization button
      • Please note: for large databases, this process may take a while (and the Replication Monitor window may appear as Not Responding)
  • In the right-hand pane of Replication Monitor
    • Select the Agents tab (in SQL 2005 select the Warnings and Agents tab)
    • Right click the Snapshot Agent and select Start Agent
      • The reason for performing this manually is that sometimes when you select the Generate the new snapshot now option, it kicks off the Snapshot Agent before the reinitialization is complete which causes blocking, deadlocks and major performance issues.

Recover out of sync tables
If the determination has been made to recover the individual tables, use the list of tables generated from the validation process and follow this process:
  • In the left-hand column of Replication Monitor
    • Expand the DB server that contains the published database
    • Right-click the Publication and select Properties
    • Select the Articles page in the left-hand column
    • Once the center page has populated, expand each table published to determine if the table is filtered (i.e. not all columns in the table are published).
      • If tables are filtered, make a note of the columns that are not pushed for each table
    • Once review of the tables is complete, click Cancel
      • If you click OK after expanding tables, it will invalidate the entire snapshot and you will end up reinitializing all articles in the publication
    • Right-click the Publication and select Properties
    • Select the Articles page in the left-hand column
    • Clear the check boxes for all out of sync tables and click OK
    • Right-click the Publication and select Properties
    • Select the Articles page in the left-hand column
    • Select the affected tables in the center pane 
      • If any tables were not completely replicated, be sure to reference your notes regarding which columns are replicated
    • Click OK when table selection is complete
      • Note: If you receive an error that the entire snapshot will be invalidated, close the Publication Properties window and try adding in a few tables at a time until all tables are selected.
    • In the right-hand pane of Replication Monitor
      • Select the Agents tab (in SQL 2005 select the Warnings and Agents tab)
      • Right click the Snapshot Agent and select Start Agent
    • Double-click the Subscription
    • Go to the Action menu and select Auto Refresh
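For scripted environments, the drop/re-add sequence above can be sketched in T-SQL.  The publication, table, and owner names are placeholders, and the GUI steps above remain the safer route if you're unsure about your article options:

```sql
-- Drop the out-of-sync article from its subscriptions, then from the publication
EXEC sp_dropsubscription
    @publication = N'My Publication',
    @article     = N'MyOutOfSyncTable',
    @subscriber  = N'all';
EXEC sp_droparticle
    @publication = N'My Publication',
    @article     = N'MyOutOfSyncTable';

-- Re-add the article (sp_addarticle accepts many more options; column filters
-- noted earlier would be re-applied with sp_articlecolumn)
EXEC sp_addarticle
    @publication   = N'My Publication',
    @article       = N'MyOutOfSyncTable',
    @source_object = N'MyOutOfSyncTable',
    @source_owner  = N'dbo';

-- Re-create the subscriptions so the new article is picked up
EXEC sp_refreshsubscriptions @publication = N'My Publication';
```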

Final cleanup
Once the snapshot has been delivered and replication has caught up on all queued transactions, perform the following to return replication to a normally running state.
    • In the left-hand column of Replication Monitor
      • Expand the DB server that contains the published database
      • Select the Publication 
    • In the right-hand pane of Replication Monitor
      • Double-click the Subscription
    • In the Subscription window
      • Go to the Action menu and select Agent Profile
      • Select the profile that was configured before you changed it (if unsure, the Default agent profile is typically the default) and click OK
    • If there is more than one subscription, repeat these steps for any additional subscriptions


I hope this helps if you run into the same situation.  I would like to especially thank Hilary Cotter for sharing his knowledge with the community as his forum and blog posts really helped me resolve the issue.

Saturday, December 10, 2011

My PASS 2011 Experience - Part 4 - The Conclusion of a Surreal Saga


My PASS 2011 Experience - Knowledge gained, vendor prizes, and meeting more people.

You will find all kinds of vendors in the vendor hall, and pretty much all vendors give away some type of swag.  Several vendors gave away an iPad 2 this year and there were also plenty of other really cool prizes.  The only thing you need to do to be eligible for stuff (especially the big prizes) is to let them scan your badge.  No big deal.  Yes, you'll get some emails and phone calls from the vendors looking to drum up some business, but it's a small price to pay to be eligible for some awesome prizes.  Spend some time meeting the vendors and you'll not only learn about cool companies and products, you may even be pleasantly surprised with taking home a sweet prize.  At the very least, you'll have a blast meeting new people among the massive throng of folks wandering around all the vendor booths!

Thursday, October 13th. 
I got up early on Thursday to head down to the hotel lounge with Mike Walsh (Blog - @Mike_Walsh on Twitter), Joe Webb (Website - @JoeWebb on Twitter), Andy Leonard (Blog - @AndyLeonard on Twitter) and about 20 others to worship and pray as part of the #PASSPrayers group.  Rob Farley (Blog - @Rob_Farley on Twitter) played the guitar for worship.  It was a great time of fellowship and the hour went by very quickly. 

Thursday was the day to recognize Women In Technology.  It was also SQL Kilt day.  Started by Grant Fritchey (Blog - @GFritchey on Twitter), Steve Jones (Editor of SQLServerCentral.com - Blog - @Way0utWest on Twitter), and Bill Fellows (Blog - @billinkc on Twitter) @ the PASS Summit in 2009, it's a way for the guys to show their support for the women that work in the technology field.  Participation has grown in the past two years, and even several vendors got in on the act by wearing kilts on Thursday.  It was great!

Bill Graziano wearing a Kilt on Thursday (many thanks to Brent Ozar for snapping this excellent pic)



I had breakfast in the hotel, hit the keynote, and then made my way to a session on branding yourself by Steve Jones.  I found this session (Branding Yourself For The Dream Job) to be chock full of great information and it is the primary reason I'm blogging today.  This session covered so much more than just how to look for a new job.  I'm not currently looking for a new job, but I was able to use information from this session to set some goals for myself and update my resume (which I hadn't touched since I landed my current job almost 6 years ago).  Again, community is the key.  It's not just about trying to get your name out there so people will recognize you, it's about actively contributing to the SQL family.  I chatted with Steve for a bit after the session (including thanking him profusely) and then made my way to lunch.  

After lunch, I finished registering for all the cool prizes I could find in the vendor hall.  Dell offered a free 10 minute chair massage, which I definitely took advantage of.  I was also able to meet with the president of my local PASS chapter and I joined on the spot.

That afternoon, HP had an NDA presentation on their partnership with Microsoft to co-develop a data appliance.  I found it very interesting, but it was Large Enterprise level technology that I won't get to play with.  They had a drawing for an HP Mini if you submitted an evaluation form.  I filled out a form, but I wasn't even thinking about winning the HP Mini.  I was working on collecting and organizing cards for another HP prize drawing when they called my name.  It took me several seconds to realize that they didn't have a question for me, but that I had won the HP Mini.  To say I was excited would be a vast understatement.  I gave the HP Mini to an old college friend that I hadn't seen in 12 years.  He needed it far more than I.

Early Thursday evening brought all the vendor drawings.  Before I go into too much detail, I need to stop and give kudos to the PASS Summit planning team.  PASS does not allow vendors to do their prize drawings simultaneously.  The vendor drawings @ TechEd often occur simultaneously and thus you have to choose between vendors (and may miss out on a drawing from another vendor).  By requiring sequential prize drawings, PASS has ensured that everyone has a fair chance at winning prizes from every vendor.  This also results in a throng of people that make their way to each vendor booth for the prize drawings.  It's great fun!

Back in the vendor hall, I was hoping to score an iPad 2 (thank you brilliant marketing folks at Apple).  Considering the fact that I had already won the HP Mini, I really didn't expect to win anything.  I was one of the first people to reach the CA booth (I had been at the very back of the crowd at the previous vendor booth), so I walked right up to the prize table and started looking over the box for their grand prize (a Samsung Galaxy Tab 10.1 - 32GB).  I got some funny looks from the ladies in the CA booth, so I explained that I was just looking at the specs.  As their looks continued, I joked "You should just take my picture now and get it over with."  They laughed, I laughed, and I put the box back on the table.  CA had a couple books up for prizes as well, and they drew names for those first.  Then they called my name for the Galaxy Tab.  I threw my arms up, jumped in the air and shouted "YES!"  The nice ladies that had given me funny looks before now wore a new expression on their face, pure shock.  I was so excited that I forgot to actually hold up the box for the tablet when they went to take my picture (I finally did at their request).  Oops, sorry CA!

After the vendor prize drawings were complete, I headed to a SQLCAT session on HA and Always On technology in SQL Server 2012.  Very exciting stuff coming in the next release of SQL Server.

I hit Gameworks Thursday evening for the Community Appreciation Party (sponsored by PASS and Microsoft).  Playing video games, eating food, and hanging out with other SQL geeks consumed the evening very quickly!

Friday, October 14th. 
Things started to wind down on Friday with a steady exodus of people all day.  Many thanks to my employer for allowing me to attend the PASS Summit in its entirety.

Again, I got up early on Friday to head down to the hotel lounge for #PASSPrayers.  It was another great time of worship, prayer and fellowship.  

After breakfast I made my way to the keynote by Dr. David DeWitt.  Wow!  Wednesday's keynote was cheesy.  While Thursday's keynote was better, it still didn't completely hold my interest.  However, Dr. DeWitt's keynote on Friday was phenomenal!  I actually left the keynote really excited about big data (which is not something I have any opportunity to work with).  Dr. DeWitt's presentation was riveting and you could just feel the excitement in the room as everyone seemed to click with what he was saying.  This is the perfect example of having the right person for the keynote.  Dr. DeWitt knows his stuff, was able to provide a lot of detailed information, and kept it engaging at the same time.  I would much rather listen to subject matter experts at a keynote than marketing teams.

I also thoroughly enjoyed Rob Farley and Buck Woody's song "I should've looked the other way":

After the awesome keynote, I hit Brad McGehee's session "Inside the SQL Server Transaction Log".  The session was fascinating and I brought more notes back for further research.  Brad (Director of DBA Education @ RedGate - Blog - @BradMcGehee on Twitter) underwent quite the transformation after the PASS Summit following RedGate's announcement about sending a DBA into space.  Congrats to Joe Miller (@gajarga on Twitter) on winning the DBA in Space contest!

I grabbed a quick lunch and then made my way to the PASS Board of Directors Meet and Greet.  It was a great time of Q&A and a few complaints (at least one of which was just a bit misguided), but overall it was an awesome chance to meet the PASS BoD and gain insight into their vision for the organization.  I left more excited about PASS as a whole and had more of a desire to contribute to the SQL Server community.

The afternoon brought me to a session by John Sterrett (Blog) on utilizing Policy Based Management and Central Management Server to oversee a large SQL Server infrastructure.  John posted some of the feedback comments on his blog (he even posted one of my comments, and called me out for my honesty).  I found the session very interesting, but through no fault of John, my brain shut down about 3/4 of the way through.  I had just had enough for one week.  John obviously knew his stuff and I still have tasks to follow up on back at the office.

With the condition of my brain, I decided to skip the last session of the day (Thomas LaRock's session on managing SQL memory - sorry Tom!).  Instead, I wandered around and ended up talking to a lot of different people.  I talked with Pinal Dave (@PinalDave on Twitter - I read his blog ALL the time), Rick Morelan (founder of Joes2Pros - @RickAMorlean and @Joes2Pros on Twitter - he has more certifications than you can shake a stick at), Brent Ozar, and a host of other people.  

I wrapped up Friday by hitting Ivar's Seafood with my college buddy.  I do so love crab legs!

Final thoughts. 
The PASS Summit is a must attend event for any SQL Server professional.  There just isn't anything that can match PASS's caliber of content, networking, and community.  If you're serious about your work with SQL Server, you need to attend this event.  Be sure to also join your local chapter if you're not already a member.

Also, be sure to pace yourself.  I attended both pre-con sessions, so I had a completely full week.  Be sure to get enough rest to go the distance.

Thank you Kathi Kellenberger for being my big sister/Aunt through the PASS First Timer's program.  And thanks for interviewing me for your Summit First Timer Follow-up blog post.

Many thanks to all the hardworking folks at PASS for a job well done!  I look forward to attending the Summit again next year.

Monday, November 21, 2011

My PASS 2011 Experience - Part 3 - The Meat and Potatoes of PASS


My PASS 2011 Experience - Wednesday - now things really start getting interesting.

Registration for PASS was an uber-simple process.  I stepped up to the table, reported my name, and within 60 seconds was walking away with my badge (just in case I forgot my name), ribbons (for the badge), and a backpack full of all sorts of geeky goodies (including some promotional material, a lanyard for the badge, a decent pen, and a SQL Server 2012 (Denali) CTP).  I wish the backpack had a small pocket on the front or side (for smaller items and/or boarding passes), but otherwise I absolutely love it (I use it every day to haul my stuff to and from work).

My only hiccup was the ribbons.  I had three ribbons and I wasn't quite sure just what to do with them.  One ribbon indicated I was a First Timer, and the other two were for each of the pre-con sessions I was attending.  It took me a minute (and a question to someone who had done this before) to realize that the ribbons needed to be stacked vertically at the bottom of the badge.  Once that was figured out, I was on my way.  Little did I realize just how much fun people have with these ribbons.  Rob Farley (Blog - @Rob_Farley on Twitter) must have 20 ribbons hanging off his badge.  Kudos to the vendors that offered ribbons for your badge.  A pretty good range of ribbons were available from a few vendors including ribbons that promoted a specific brand or product to ribbons that were just silly or humorous.  I grabbed one from Quest that said "DBA Daddy" (it was quite fitting considering the fact that my daughter lost her first tooth while I was at PASS).

Wednesday, October 12th. 
I headed down to the hotel restaurant for breakfast just after 7am and ran into Thomas LaRock (Website/Blog - @SQLRockstar on Twitter).  He graciously invited me to join him for breakfast.  With him were Tim Ford (Website/Blog - @SQLAgentMan on Twitter) and Andy Leonard (Blog - @AndyLeonard on Twitter).  I was excited and humbled to be able to hang out with these giants in the SQL community.  I would quickly realize that this was the norm.  The SQL community is all about just that, community.  If you see someone that you look up to in the SQL community, reach out and say "Hi!"  Chances are, you'll wind up having a meaningful conversation and possibly even share a meal with them.  The conversation took an interesting twist at one point when they mentioned that Brad McGehee (@BradMcGehee on Twitter) had a really sweet announcement he was going to make for RedGate at 10:05am.  I made a mental note to be in the vendor hall @ the RedGate booth at that time to find out what all the excitement was about.

The keynote presentation started off with a bang, but quickly fizzled.  The Microsoft announcement that SQL Server Denali would be officially known as SQL Server 2012 was met with thunderous applause.  The concept of big data on Windows (specifically Hadoop) was pretty cool, but the excitement was quickly extinguished by Microsoft's marketing department's use of Excel and text too small for anyone to be able to read.  Relating big data to selling frozen yogurt to High School "kids" didn't help (I don't know of anyone in High School who likes to be called or treated like a "kid").  The excitement of the presenters was not felt by the audience and each subsequent "amazing" announcement was met with ever diminishing rounds of polite golf claps.  However, there was one thing that kept me from falling into a deep coma.  Twitter.  Twitter was ablaze with comments on the presentation.  There were a few unprofessional comments made, but overall the stream during the keynote very humorously pointed out the failings of the presentation.  While everyone around me was nodding off, I had to stifle my laughter.

After the keynote I made my way to the vendor hall to catch the big announcement from RedGate.  No WAY!  They're going to send a DBA into space!  Wow, what a chance of a lifetime.  I have since completed all the questions and submitted my bid to be the first DBA in space.  YAY!

I then made my way to a really interesting session, "SQL Server Storage Engine - Under the Hood: How SQL Performs I/O," presented by Thomas Grohser (Blog - @TGrohser on Twitter).  BTW, sorry I was late for your session Tom, I was snagging some of the freeze dried ice cream from Red Gate. Tom's session provided a detailed explanation of SQL Server I/O and where to look for bottlenecks (I've used this information multiple times at work since PASS).  I found this session to be very insightful, and Tom is an excellent speaker.

After lunch I purchased the SQL Server MVP Deep Dives 2 book and then made my way through the book signing line where all the MVP authors signed it.  It was a really cool experience to be able to meet all the people that poured their hearts out to contribute not only to this book, but to the SQL community.

I blew off the afternoon sessions to spend time in the vendor hall.  I made my way around to most of the vendor booths and signed up for a whole host of different vendor prizes.  I've never won anything from the many vendor drawings I've entered in the past, but I'll never win anything if I don't enter the contests in the first place.  I felt like a kid in a candy store in the vendor hall.  I spent a good amount of time in Dell's booth looking at blade servers.  I also spent at least an hour in EMC's booth talking about our SAN, and they interviewed me for a video blog (my segment starts @ 3:40).


At 4:30 pm I left the vendor hall and made my way to Brent Ozar's session: BLITZ! The SQL - More One Hour SQL Server Takeovers.  I've already commented on Brent and his sessions before.  This presentation was no different.  He knows his stuff and is passionate about it.  When work isn't completely insane, I try to regularly attend the free Brent Ozar PLF Technology Tuesday Triage web presentations.  BTW, the PLF is short for the other three wonderful folks in his consulting group.  Jeremiah Peschka, Kendra Little, and Tim Ford round out the ranks @ Brent Ozar PLF.  I met the whole crew at PASS and I gotta say, they're some of the best in the business.

Wednesday evening was the Exhibitor Reception in the vendor hall.  The food was fancy, but about the only thing I really liked was the calamari.  I probably would have liked a cheese tray and crackers better, but the food really didn't matter; I was there to spend time with the vendors.  And I did.  Again, kid in a candy store.

I was on the phone with my wife as I left the convention center after the vendor event.  Suddenly, I saw an old college friend that I hadn't seen in 12 years (he used to work as a programmer right next to my wife in the Data Processing department at college).  We chatted a bit and agreed to catch up the next day.  That was just too cool!

Back at the hotel I decided I was really hungry, so I decided to head over to Sullivan's Steakhouse across Union Street from the hotel.  I sat down and ordered a nice dry aged 14 oz NY Strip steak.  My waiter asked me if I was a Star Trek fan and pointed out someone just down the bar from me.  Once I was able to get a clear view of his face I realized it was Jonathan Frakes (Commander Riker from Star Trek: The Next Generation - @jonathansfrakes on Twitter).  My waiter indicated that he wasn't sure and that he wasn't allowed to ask.  I responded that he might not be allowed to ask, but there was nothing stopping me.  I wandered over, introduced myself and very politely asked him if he was Jonathan Frakes (obviously he was) and said I was a big fan.  He's working on a new SciFi series and was in town to work with the Seattle Symphony as they were playing the score for the series.  I kept my conversation with him brief as I didn't want to pester him.  He was very nice and he even made it a point to ask my name at the end of our conversation.  Very cool!

At this point I was exhausted and headed back to the hotel to crash for the night.  What a day!  And that was only Wednesday!

Next time I hope to wrap up coverage of the conference and provide some follow-up thoughts.

Thursday, November 10, 2011

My PASS 2011 Experience - Part 2 - Pre-con Sessions


My PASS 2011 Experience - the continuing saga.

Pre-conference (pre-con) sessions were available both Monday and Tuesday, and I signed up for sessions both days.  These all day (8:30am - 4:30pm), deep-dive sessions thoroughly immerse you in the subject matter.  I highly recommend twisting the arms of your managers to get them to pay for these additional sessions, as they are a fantastic way to really delve into the topics covered.  Personally, I feel that the pre-con sessions held so much value that I got my money's worth (or more specifically my company's money's worth) just from the 2 pre-con sessions alone.  However, there is just too much fun to be had, knowledge to be gained, and people to meet that skipping the main portion of the conference is just not an option (more on that in my next post).

One recommendation: bring a power strip with a long cord, as outlets were at a premium.  I found myself plugging my laptop in on every break (15 minute break mid-morning, 1 hour lunch break, and a 15 minute break mid-afternoon).  This allowed me to have full use of my laptop during the all day sessions (albeit my battery was nearly exhausted by the end of the day).

A beverage recommendation: when the drinks are available, take advantage of them.  Juices were available at breakfast, water or iced tea were the only drinks available at lunch, and coffee, tea and soda were available during the mid-morning and mid-afternoon breaks.  If you get pretty thirsty like I do, grab 2 or 3 bottles of juice with breakfast or cans of soda on your breaks so that you have enough to drink throughout the day.

Monday, October 10th.  
I woke early on Monday (still on East Coast time) and had breakfast with Aunt Kathi (Kathi Kellenberger - @AuntKathi on Twitter).  Kathi had interviewed me shortly before PASS 2011 as I participated in the First Timers program (that interview can be found here on the PASS Blog).  The First Timers program was absolutely brilliant!  Pure genius!  Being able to talk to Kathi about what to expect during PASS made it much more enjoyable and I was able to take advantage of many opportunities to learn and network that I would have missed if I were going alone.  I also had a very enjoyable time learning more about her history (she started off as a Pharmacist and moved into SQL - talk about a career leap).

After breakfast I went to my first pre-con session: "Virtualization and SAN Basics for DBAs" with Brent Ozar (www.brentozar.com @BrentO on Twitter).  I have attended some of Brent's sessions in the past, and he is an excellent presenter.  He knows his stuff, and his enthusiasm and excitement are infectious.  I eagerly anticipated attending this session as we are in the process of migrating to and expanding our VMware and SAN infrastructure.  

Brent Ozar displaying his trademark "Jazz Hands" at SQL Saturday in Chicago
Brent does a really cool thing in his sessions; he offers a chat room for attendees to interact and ask questions.  Periodically he reviews the chat and answers any questions there.  About an hour into the session, I realized that I had more experience with SAN and Virtualization than I initially thought, and I started answering questions in chat.  I think there may have been only one question in chat that I was not able to answer for the remainder of the day.  I was also able to verbally ask quite a few questions about our infrastructure and the challenges we're facing.  It felt great to really be able to participate in this session.  My favorite moment in this session was when Brent said, "Folks, Matt Slocum is not a plant."

After Brent's awesome session, I went to the Networking dinner at Lowell's on the waterfront.  Andy Warren and Steve Jones coordinated the dinner, and it was a great time to meet people as passionate about SQL as I am.  I left there with several new friends, and we made our way over to the Tap House to hang out and chat.  It was a great time to meet more SQL geeks like me, many of whom I had seen online and whose blogs I had read.  After that, Aunt Kathi and quite a few of my new friends headed over to Bush Garden for karaoke, but still being on East Coast time I retired back to the hotel to rest up for my next pre-con session.

Tuesday, October 11th.  
I slept a bit later (starting to get used to West Coast time) and headed to breakfast at the Convention Center. Breakfast left something to be desired.  It was a very nice continental breakfast (fruit, cereal, muffins, bagels, bread, etc...), but I was hoping for something more substantial that included eggs, ham or bacon (really wanted the bacon).  I was able to stock up on a couple bottles of juice and head off to Adam Machanic's session "No More Guessing! An Enlightened Approach to Performance Troubleshooting."

Adam (Blog - @AdamMachanic on Twitter) and I had already been tweeting back and forth a bit on Twitter, so it was cool to be able to chat face-to-face before the session started.  Adam knows his stuff and keeps the material engaging.  I gained precious insight into troubleshooting SQL Server performance, and I have a whole host of things I need to follow up on after this session.

Steve Jones snapped this pic of Adam's Performance session (I'm the one in front giving a thumbs-up)
Tuesday evening was the Welcome Reception and quizbowl (sponsored by Dell).  Being my first PASS, I had never witnessed the quizbowl before and I found it very interesting (all the participants were awesome and Rob Farley was especially hilarious).  There was some fierce competition and very humorous responses.  I was able to find enough food at the reception to stem my hunger and retired back to the hotel when things started to wind down (still not quite used to the time zone).

Overall, I learned a lot on Monday and Tuesday.  I really enjoyed the deep-dive into the topics, and meeting a lot of new people.  Friendships were forged in those days that will last a lifetime.

Stay tuned as next time I'll begin coverage of the main conference when things really start to get interesting.

Wednesday, November 9, 2011

My PASS 2011 Experience - In a word, "WOW!" - Part 1

Part 1 - Introduction

I attended TechEd 2009 in Los Angeles, CA. I thought LA was a good venue, but there were many session slots that just didn't have any content for a SQL DBA.  However, there is no shortage of things to do in LA, so I was able to take advantage of some of the local sites/attractions (I also made good use of the labs and vendor hall).

In 2010, I attended TechEd in New Orleans.  There's not a lot to do in New Orleans other than hit Bourbon Street (I don't drink, so aside from listening to some live Jazz artists, it wasn't my cup of tea), but the SQL content overall was much better.  There were several session times where I had multiple sessions to choose from, but in most time slots I had only one session that really appealed to me.

Just before TechEd 2010 in New Orleans, I found out about the PASS Summit, and I convinced management to send me to PASS Summit 2011.  Wow!  Just ... WOW!  Every single session time slot was jam packed full of delicious sessions.  In most session times I was doing well to narrow it down to only 3 or 4 sessions that I was interested in.

Overall my PASS experience blew me away and I can't wait to go again next year.

The week before PASS I signed up for a Twitter account (@SlocumMatt).  I thought it might help me while @ PASS.  It totally did and I highly recommend to anyone going to PASS to create a Twitter account (if you don't already have one) so you can more easily connect with the SQL community while @ PASS.

Without further ado, I'd like to present to you, My PASS 2011 Experience (Warning, OutputVerboseLevel = 2).

Sunday, October 9th.  
I got up at 3am ET to catch a 5:30am ET flight (and I was very excited the night before, so I only got about 3 hours of sleep).  I had a brief layover in Philly and arrived in Seattle almost 45 minutes early (10am PT).  After collecting my bag from baggage claim, I headed to the train station (attached to the airport, very convenient).  The train from the airport to Westlake was inexpensive ($2.75) and took about 40 minutes.  I walked two blocks to the hotel (Sheraton) and got checked in (nice room on the 31st floor).

Sheraton Hotel (Convention Center is to the right of the hotel)
By this time it was only 11am, so I had almost an entire day to kill.  I walked over to The Cheesecake Factory and had a nice lunch.  Then I caught a movie (Lion King in 3D).  The restaurant and theater were within 1 block of the hotel, so I didn't have to walk far.  The Convention Center was on the next block to the North of the hotel.
Washington State Convention Center
I crashed in the hotel room for a little while and got settled in, then I walked over to the Convention Center just after 5pm and registered for PASS.  I wasn't all that hungry (from travelling and being so tired), so I hit a grocery store and picked up some Baked Lays (they really help settle my stomach) and crashed in the hotel.  I watched some TV and got TweetDeck installed.  About 7pm I was going to go hang out with Aunt Kathi (Kathi Kellenberger - @auntkathi on Twitter), but she wasn't going to get to the hotel until about 8pm and I was just about ready to pass out from lack of sleep (and I was still on ET), so we decided to meet for breakfast the next morning.

Stay tuned for scenes from my next episodes (including pre-con sessions and remaining PASS coverage).