Wednesday, July 10, 2013

Maintaining SQL Replication Publications -or- How To Add An Article To A Publication AND Generate A Snapshot

For quite some time, I was stumped as to how to properly script adding an article to an existing publication.  It's a somewhat straightforward process using the sp_addarticle command, but for some reason when I would start the Snapshot Agent, a snapshot would never be generated for the table I added to the publication (and thus it would not be synced to the Subscriber).  

In my case the sp_addarticle command looks like this (the PublicationName and TableName were changed to protect the innocent):
EXEC sp_addarticle @publication = N'PublicationName', @article = N'TableName', @source_object = N'TableName', @type = N'logbased', @pre_creation_cmd = N'drop', @destination_table = N'TableName';

Simply running sp_addarticle with the correct options was not enough to allow a snapshot to be generated for that table.  When I started the Snapshot Agent, it would complete successfully with the message [0%] A snapshot was not generated because no subscriptions needed initialization.  Perplexed, I decided to perform the same actions through the GUI (I knew it worked properly there) and then run SQL Server Profiler and filter by my local HostName.  

Note: If you have enabled the @immediate_sync option (it is off by default), you will not be able to drop individual tables/articles from the publication.  However, you can turn it back off without recreating the publication by issuing the following command:
EXEC sp_changepublication @publication = N'PublicationName', @property = N'immediate_sync', @value = N'false';

In SQL Server Profiler, I saw sp_addarticle executed and I also observed several helper SPROCs that the GUI uses for validation, but then the diamond in the rough appeared.  There was an additional SPROC executed against the published database: sp_refreshsubscriptions.  Executing sp_refreshsubscriptions refreshes the subscription metadata for any new articles added to the publication, which the Snapshot Agent subsequently uses to determine which articles need a new snapshot.  The syntax is very simple (unlike sp_addarticle):
EXEC sp_refreshsubscriptions @publication = N'PublicationName';

Just replace the PublicationName with the name of the publication to which you have just added article(s) (you can add multiple articles and then execute sp_refreshsubscriptions only once). Once you execute it in the context of the published database, the next time the Snapshot Agent runs, it will generate a snapshot for the article(s).

Note: You do not need to run the sp_refreshsubscriptions SPROC when creating a new Subscription/Publication, as creating the Subscription after the Publication has been created populates all the article Subscription metadata from the Publication.

While I'm on the topic, I also want to quickly touch on how to drop a table from a Publication.  There are two SPROCs to run to remove a table from the publication, sp_dropsubscription and sp_droparticle:
EXEC sp_dropsubscription @publication = N'PublicationName', @article = N'TableName', @subscriber = N'all', @destination_db = N'all';

EXEC sp_droparticle @publication = N'PublicationName', @article = N'TableName', @force_invalidate_snapshot = 1;

Just update the PublicationName and the TableName as appropriate.  

The final tip for today is that you can script starting the Snapshot Agent.  It is another simple SPROC that you run against the published database:
EXEC sp_startpublication_snapshot @publication = N'PublicationName';

We use sp_startpublication_snapshot extensively when setting up or modifying a Publication (it sure beats finding the appropriate SQL Agent job on the Distributor, and we don't have to bring up Replication Monitor).
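Putting it all together, the full sequence for adding an article looks like this (using the same placeholder PublicationName and TableName from above, executed in the context of the published database):

EXEC sp_addarticle @publication = N'PublicationName', @article = N'TableName', @source_object = N'TableName', @type = N'logbased', @pre_creation_cmd = N'drop', @destination_table = N'TableName';
EXEC sp_refreshsubscriptions @publication = N'PublicationName';
EXEC sp_startpublication_snapshot @publication = N'PublicationName';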

Thursday, May 2, 2013

The Road To Hong Kong - or rather MCSE: SQL Server 2012 Data Platform

While I was at the PASS Summit 2012 (THE best SQL Server conference on the planet) in November, I decided I needed to start working on updating my Microsoft SQL Server certifications.  The task seemed simple enough, but the challenge was more than I had initially anticipated.

I've been a DBA for over 7 years, so while at the PASS Summit, I took the 70-462 exam: Administering Microsoft SQL Server 2012 Databases.  I passed without studying, as I've been keeping up on the new features of SQL 2012 and I manage multiple SQL 2008 R2 instances (which are administratively very similar to SQL Server 2012).  Filled with elation at passing the first exam, I decided to take the 70-461 exam, Querying Microsoft SQL Server 2012, the very next day.  Alas, I failed that exam (I got a 650), but seeing as I had a 50% off discount, I wasn't too concerned.  I chalked it up to a learning experience.


Fast forward a couple of months (mid-January 2013), and my company was working on renewing our Microsoft Gold Partnership.  This partnership relies on employing people with current Microsoft certifications.  I have my MCITP: Database Administrator on SQL Server 2008 (which also covers SQL Server 2008 R2), so we should be fine, right?  Nope.  Microsoft decided that all of our SQL Server 2008 (and Windows Server 2008) certifications no longer count toward the Gold Partnership, as they're being deprecated this year.  That seemed somewhat odd, since mainstream support for SQL 2008 and SQL 2008 R2 runs until July 2014.  Not to mention the fact that you can still get the new MCSA and MCSE certifications on SQL Server 2008.


What does this mean to me?  I was given a business directive to get my MCSE certification on SQL 2012 by August of this year.  GULP!  Really?  Wow!  {deep breath}  OK.  Let's do this thing!

I decided to take a three-pronged approach to preparing for the MCSE exams:
  • Microsoft Training classes (there is one class that "corresponds" with each individual exam)
  • Examine the Skills Measured for each exam and review the appropriate Books Online/MSDN Library pages (I spent a lot of time doing this)
  • Play (I have SQL 2012 installed on a couple systems at home and I practiced with any topics I was not intimately familiar with)
I was most worried about the Data Warehouse exam as I have the least amount of experience with that technology.  I took the class and then reviewed all of the Skills Measured for that exam.  Most of them were pretty straightforward, but I needed to look up a lot of the terms to make sure I understood them and in what situations I would use them.

With the return of the MCSA and MCSE, Microsoft has tried to increase the value of the certifications and make sure you know what you are talking about.  In the past, the Microsoft Training classes that relate to the exams thoroughly covered the exam objectives.  If you took the class, paid attention and took good notes, and then went and took the exam, you would pass.  That is not the case with the current generation of exams and classes.  This is not necessarily a bad thing; in fact, I think it's a good thing.  To pass an exam, you will need to review the Skills Measured for that exam and make sure you are familiar with the concepts, syntax, and sometimes even command line switches.

What's the best way I found to learn and really drive these points home?  Play!  Set up an instance of SQL Server on a system somewhere and just start playing with the features and concepts you need to learn for the exam.  If you can find someone else who is also interested in taking the exams (or at least in learning the topics they cover), that helps tremendously, as you need to understand a topic before you can explain it to someone else.

Overall, I'm glad that this business requirement came up as it motivated me to buckle down and get 'er done!  I highly recommend to anyone that is passionate about SQL Server to go for this certification.  You'll learn a few things in the process and hold a more valuable certification.


Requirements for the SQL Server 2012 MCSE: Data Platform certification:
  • MCSA: SQL Server 2012 (exams 70-461, 70-462, and 70-463)
  • Exam 70-464: Developing Microsoft SQL Server 2012 Databases
  • Exam 70-465: Designing Database Solutions for Microsoft SQL Server 2012

Monday, December 10, 2012

My Very First Guest Post


I met someone at the PASS Summit this year who really pushed past her fears to meet a lot of people and make an impact: Andrea Allred.  As a tribute to her bravery, I have submitted a guest post on her blog, specifically about dealing with your fears and not letting them stand in your way.  Swing over to her blog (www.royalsql.com) and check out her awesome posts.

My guest post can be found here: http://www.royalsql.com/2012/12/10/fear-is-a-part-of-life-by-matt-slocum/

SQLFamily is awesome!

Wednesday, December 5, 2012

How to Downgrade SQL Server Editions


I recently ran into an issue where the production VM template (containing the Enterprise Edition of SQL Server 2008 R2) was used to deploy new SQL Server VMs for Engineering development (which should instead run the Developer Edition of SQL Server 2008 R2).  This downgrade path is not supported by Microsoft, so a more creative approach is needed to accomplish the task.

In order to complete this task, uninstalling the higher edition of SQL and reinstalling the correct edition is required.  I was able to work out the following process that helped to ease the pain of the downgrade.  I have used this process to downgrade Enterprise Edition to both Developer Edition and Standard Edition, but it should also work for any other lower Edition of SQL Server (Web, Workgroup, etc…)

I started by verifying the integrity (DBCC CHECKDB) and creating fresh backups of all DBs (including system DBs).
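As a sketch, the integrity checks can be run across every database with the undocumented (but widely used) sp_MSforeachdb procedure, where the ? placeholder is replaced with each database name:

EXEC sp_MSforeachdb N'DBCC CHECKDB(''?'') WITH NO_INFOMSGS;';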

I then created scripts to recreate all of the following (in case the process failed at any point):
  • Logins (including roles) – contained in the master database
    o I used sp_help_revlogin, which works great for recreating the logins with their existing passwords, but unfortunately it does not script out the role membership (I scripted that manually).  For those that are curious, the password is hashed in the script, so you can't use sp_help_revlogin to look up/hack a login password.
  • Jobs – contained in the msdb database
    o I had job creation scripts from setting up the server in the first place, but you can also select the Jobs folder in the Object Explorer pane in SSMS and then right-click each job.  Just select Script Job as… > CREATE To > New Query Editor Window.
  • SSIS Packages – contained in the msdb database
    o Restoring the msdb database should recover your SSIS packages, but you may want to export them through Integration Services, just to be certain to preserve your hard work.
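For the server role membership that sp_help_revlogin misses, a query along these lines can generate the sp_addsrvrolemember calls (a sketch only; review the generated commands before running them against the new install):

SELECT 'EXEC sp_addsrvrolemember @loginame = N''' + m.name + ''', @rolename = N''' + r.name + ''';'
FROM sys.server_role_members AS rm
JOIN sys.server_principals AS r ON rm.role_principal_id = r.principal_id
JOIN sys.server_principals AS m ON rm.member_principal_id = m.principal_id;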

Note: The build number must match exactly for this process to succeed.  I applied SP2 for SQL 2008 R2 (build 10.50.4000) before starting the process to make things easier after the reinstall (that way I didn't have to track down which Cumulative Updates and/or Security Updates were applied to the current instance of SQL).
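You can verify the build (and Edition) before the uninstall and again after the reinstall with a simple query:

SELECT SERVERPROPERTY('ProductVersion') AS Build, SERVERPROPERTY('Edition') AS Edition;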

Next, I stopped all SQL Server services and created a folder on the Desktop (called System DBs).  I copied in the MDF and LDF files for the Master, MSDB, and Model (as we had customized Model) databases for the Enterprise Edition installation of SQL Server. 

With all the ground work in place, I then uninstalled all MSSQLSERVER features to remove Enterprise Edition (I grabbed a screen shot of the Select Features screen while I was uninstalling so that I could be certain to install the correct features when reinstalling).  Uninstalling the Shared Features is not necessary, and will only make the process take longer. 

I then installed Standard Edition to the same folder as the previous (Enterprise) edition.  I also applied the correct service pack to SQL Server (to bring the build up to the same as the previous Edition).  Installing SQL Server to the same folder will overwrite all existing System DBs, but your User DBs will remain unscathed.

This is the point in the process where I tried to add some awesome sauce.

Instead of attempting to restore master (which can be a royal pain), model, and msdb, I merely stopped all SQL services.  I then created a new folder on the Desktop (called New System DBs) and moved out the current MDF and LDF files for the Master, Model and MSDB databases.  Then I copied in the MDF and LDF files from the previous installation of the Master, Model and MSDB databases and started the SQL services again.

This worked perfectly!  I was presented with all my databases, logins (with the appropriate roles), jobs, SSIS packages, etc...

Summary:
1. Verify integrity of all DBs (DBCC CHECKDB)
2. Make backups of all DBs (including system DBs)
3. Make scripts to be able to attach DB and recreate logins (including roles), jobs, SSIS packages, etc... (just in case the overall process breaks)
4. Make note of the current Service Pack for SQL Server and the path(s) that SQL Server is installed to.
5. Stop all SQL Services
6. Make copies of the MDF and LDF files for Master and MSDB (Model too if you've customized it like I have) to System DBs folder
7. Uninstall SQL Server Enterprise Edition (all Features under MSSQLSERVER need to be removed and be sure to make note of the Features you're uninstalling)
8. Install SQL Server Standard Edition (or the Edition you're trying to downgrade to - i.e. Web, Developer, etc...) making sure to install the same features for SQL Server
9. Install the same Service Pack as was previously installed
10. Stop all SQL Services
11. Move current MDF and LDF files for Master and MSDB (Model too if customized) out to New System DBs folder
12. Copy in MDF and LDF files for Master and MSDB (and Model if customized) from previous installation
13. Start SQL Server services

I hope this process helps you if you find yourself in the same situation.

Thursday, November 8, 2012

My PASS Summit 2012 Experience - Day 2 Keynote (LiveBlog)


Thursday is Women In Technology day at the PASS Summit.  This year I was able to wear a kilt  to show support for the fairer gender. 

The keynote started off with an overall PASS status update. Emphasis was given to the community, and how you can do much in the community without huge financial resources at your disposal.


Thomas LaRock stepped up and announced some additional awards being added to the PASS lineup. Jen Stirrup was awarded the PASSion award, and she totally deserves it for her contributions to recognizing Women In Technology.


PASS Summit 2013 was mentioned. It will be held October 15-18 in Charlotte, NC. www.sqlpass.org/summit/2013/


Quentin Clark (Microsoft Corporate Vice President, Database Systems Group) spoke on data analysis and how businesses can use that data to market their business. This poses new challenges when trying to analyze that data in a more real-time manner.


A large package shipping company was mentioned, along with the analytics they perform to generate additional revenue by analyzing the contents of their data warehouse. The data life-cycle was discussed and demoed. Effective use of the data requires analysis and collaboration between multiple departments/companies to take full advantage of the benefits that proper analysis can reap.

Analyzing data from a movie theater company's data warehouse was demoed (utilizing PowerView). There was a mixed response, as most DBAs and straight-up developers were not incredibly impressed or interested. Karen Lopez tweeted that big data might not be interesting, but it is inevitable, and that we need to at least be aware of the technology and how to work with it. As a DBA, I'm likely not going to be using Excel very frequently. If the demos covered more of how things work under the hood, a lot more interest would be garnered. I think that's why Dr. DeWitt was so insanely popular last year.


The ability to add Azure as an Availability Replica was an awesome feature shown off toward the end.  The demo ended on a high note as more functions and features were presented.


I'm really looking forward to Dr. DeWitt's session tomorrow. What a brilliant man!


Wednesday, November 7, 2012

My PASS Summit 2012 Experience - Day 1 Keynote (LiveBlog)


Bill Graziano opened the day with a PASS status update and Summit overview. Connect, Share, and Learn is still the mantra of PASS.  Community is of critical importance. No one can know everything, but together we can accomplish anything. Bill recognized many of the community volunteers. It was a great opportunity to realize and acknowledge the value of those that contribute to the community.

A new user group website was announced with additional and more valuable tools. These tools are designed to ease chapter management and administration. As a chapter leader, this excites me greatly.

A PASS BI conference in Chicago (at the Sheraton Chicago Hotel and Towers) was announced, April 10-12, 2013. The BI arena is growing rapidly and a dedicated conference is awesome.

It's an exciting time to be involved in PASS. PASS is growing (over 127,000 members) and is becoming much more globally focused. International growth has been tremendous. 57 countries are represented and additional growth opportunities are actively being developed.

Ted Kummert (Microsoft Corporate Vice President) presented the keynote. Microsoft is extremely excited about the release of SQL Server 2012. SP1 for SQL Server 2012 was announced and released today. I'm very excited about that, as it addresses a performance issue I hit when testing SQL Server 2012.

Big/non-relational data (including, but not limited to, Hadoop) is still a big focus for Microsoft. Data storage requirements are only going to continue to increase in the future, so dealing with larger and larger amounts of data will continue to garner attention.

In-memory technology (codename: Hekaton) was discussed, and it was announced that it will be a feature in the next major version of SQL Server. The technology was demoed: you can pin an entire table (including indexes and table structures) into RAM (the demo initially yielded a 5x gain in performance). You can also recompile SPROCs to run natively in memory (in combination with an in-memory table, this yielded nearly a 30x gain in performance in the demo).  Also included is a tool that examines performance and recommends tables and SPROCs for in-memory optimization. No code or hardware changes were needed to reap huge benefits.

Performance gains from an in-memory column store index were also discussed.
Hekaton makes SQL incredibly fast ("wicked fast"), and since it's merely an enhancement to SQL Server, if you know SQL Server, you know Hekaton.

Additional optimizations to the parallel data warehouse have yielded massive performance improvements in that arena as well.

An updated query processor was announced. PolyBase handles queries from multiple sources, including local and cloud sources.  This allows you to use standard T-SQL code to query Hadoop and other non-relational databases. You can even JOIN between relational and non-relational data. This will also be included in the next major version of SQL.

BI enhancements were announced. Excel 2013 is now a complete BI tool. PowerView has been integrated into Excel to provide exceptional BI functionality. Full interactive maps were demoed directly within Excel.  

Overall, the keynote on Wednesday was very exciting for the SQL Server professional.  I eagerly anticipate the next major release of SQL Server.

Wednesday, August 29, 2012

SQL Server Service crash after Service Pack Install

A tough lesson regarding why you should always keep the configured default Data and Log paths updated in SQL Server.

At my company, our virtual infrastructure is growing and changing rapidly, and the configuration of our SQL Servers is changing with it.  This results in additional drives and drive letter changes for our SQL servers.

Unfortunately, if the default Data and Log paths are not updated when the drive letters change, there will be errors after installing (or uninstalling) a Service Pack or Cumulative Update.  The GUI portion of the install will complete successfully, but additional updates occur on the initial service start after the GUI portion completes.  One update that occurs at that initial service start is database upgrades.  If your default paths for data and log files are not correct (in my case the drives referenced no longer existed), the upgrade of the master database will fail and the SQL Server service will crash.

In a production setting, this can induce fear, panic, and mass hysteria (or at the very least mass Google searching, which is likely how you arrived at this page to begin with).  However, the solution is pretty simple as the default paths are stored in the Windows registry.

Launch Registry Editor (regedit.exe) and navigate to the following section of the registry tree: 
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\{SQL Version}\MSSQLServer

Note: If you are running a 32-bit version of SQL Server on a 64-bit OS, you'll need to navigate to the Wow6432Node tree (HKEY_LOCAL_MACHINE\Software\Wow6432Node\Microsoft\Microsoft SQL Server\{SQL Version}\MSSQLServer)

SQL 2000 {SQL Version} = MSSQLServer
SQL 2005 {SQL Version} = MSSQL.1
SQL 2008 {SQL Version} = MSSQL10.MSSQLSERVER
SQL 2008 R2 {SQL Version} = MSSQL10_50.MSSQLSERVER
SQL 2012 {SQL Version} = MSSQL11.MSSQLSERVER

Then update the paths specified in the following keys:
DefaultData
DefaultLog
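To avoid this situation in the first place, the default paths can also be corrected from T-SQL while the instance is still running, using the undocumented xp_instance_regwrite procedure (which resolves the instance-specific portion of the registry path for you).  The drive letters below are just examples; substitute your actual data and log paths:

EXEC master.dbo.xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', REG_SZ, N'E:\MSSQL\Data';
EXEC master.dbo.xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', REG_SZ, N'F:\MSSQL\Logs';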

Once the paths are correct, the master database upgrade will complete successfully (as SQL Server will be able to create the temporary files in the default paths) and once all other system and user DBs are upgraded you should be back up and running.  

Note: The database upgrades will take some time if you have a lot of user DBs, but you can monitor the ERRORLOG to observe the progress.

Here is the appropriate snippet from my log for the failure on the data file:
2012-08-29 08:45:43.39 spid7s      Error: 5133, Severity: 16, State: 1.
2012-08-29 08:45:43.39 spid7s      Directory lookup for the file "G:\MSSQL\Data\temp_MS_AgentSigningCertificate_database.mdf" failed with the operating system error 3(The system cannot find the path specified.).
2012-08-29 08:45:43.39 spid7s      Error: 1802, Severity: 16, State: 1.
2012-08-29 08:45:43.39 spid7s      CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
2012-08-29 08:45:43.39 spid7s      Error: 912, Severity: 21, State: 2.
2012-08-29 08:45:43.39 spid7s      Script level upgrade for database 'master' failed because upgrade step 'sqlagent100_msdb_upgrade.sql' encountered error 598, state 1, severity 25. This is a serious error condition which might interfere with regular operation and the database will be taken offline. If the error happened during upgrade of the 'master' database, it will prevent the entire SQL Server instance from starting. Examine the previous errorlog entries for errors, take the appropriate corrective actions and re-start the database so that the script upgrade steps run to completion.
2012-08-29 08:45:43.39 spid7s      Error: 3417, Severity: 21, State: 3.
2012-08-29 08:45:43.39 spid7s      Cannot recover the master database. SQL Server is unable to run. Restore master from a full backup, repair it, or rebuild it. For more information about how to rebuild the master database, see SQL Server Books Online.
2012-08-29 08:45:43.39 spid7s      SQL Trace was stopped due to server shutdown. Trace ID = '1'. This is an informational message only; no user action is required.

You'll see a similar error if the log folder is not set correctly:
2012-08-29 08:57:50.73 spid7s      CREATE FILE encountered operating system error 3(The system cannot find the path specified.) while attempting to open or create the physical file 'F:\MSSQL\Logs\temp_MS_AgentSigningCertificate_database_log.LDF'.
2012-08-29 08:57:50.93 spid7s      Error: 5123, Severity: 16, State: 1.
2012-08-29 08:57:50.93 spid7s      CREATE FILE encountered operating system error 3(The system cannot find the path specified.) while attempting to open or create the physical file 'F:\MSSQL\Logs\temp_MS_AgentSigningCertificate_database_log.LDF'.
2012-08-29 08:57:50.93 spid7s      Error: 1802, Severity: 16, State: 4.
2012-08-29 08:57:50.93 spid7s      CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
2012-08-29 08:57:50.93 spid7s      Error: 912, Severity: 21, State: 2.
2012-08-29 08:57:50.93 spid7s      Script level upgrade for database 'master' failed because upgrade step 'sqlagent100_msdb_upgrade.sql' encountered error 598, state 1, severity 25. This is a serious error condition which might interfere with regular operation and the database will be taken offline. If the error happened during upgrade of the 'master' database, it will prevent the entire SQL Server instance from starting. Examine the previous errorlog entries for errors, take the appropriate corrective actions and re-start the database so that the script upgrade steps run to completion.
2012-08-29 08:57:50.93 spid7s      Error: 3417, Severity: 21, State: 3.
2012-08-29 08:57:50.93 spid7s      Cannot recover the master database. SQL Server is unable to run. Restore master from a full backup, repair it, or rebuild it. For more information about how to rebuild the master database, see SQL Server Books Online.
2012-08-29 08:57:50.93 spid7s      SQL Trace was stopped due to server shutdown. Trace ID = '1'. This is an informational message only; no user action is required.