Personal Investments

I am a little late in the certification arena but I have a good excuse… my dog ate my ambition! Actually, I have been plugging away at my Computer Information Systems (CIS) degree with a concentration in Database Management, and it boils down to a few remaining classes. It seems like I have been working toward this degree for a lifetime. I would have been finished long ago, but I had to decide whether I would: 1) run full steam ahead with my education while working full-time in a position that required travel and let my grades suffer, or 2) attend part-time and maintain a 3.5 or higher GPA. I chose the latter. Even though some have told me that a (D) will still get you a degree, that is just not my character. I believe if you are going to commit, then you need to commit 100%, especially when the investment comes with a hefty price tag. Not to mention “half-ass-ism” can be habit-forming, and that quite frankly is something I am not looking to catch.

During these next few classes I have decided to pursue DBA-specific certifications. I mean, I am certified, but the certifications I currently hold are desktop-support centric, and since my role has changed it is only natural that I change my focus. To be honest, when I first started in the IT field I wanted nothing to do with databases. I was set on being a Server or Network Admin and thought databases were boring. Looking back I realize I was foolish, because now I am consumed by it and wish to do nothing but DBA work. I literally started working with SQL Server 2000 through an initiative put into action by my previous employer. The overall goal was to eliminate the use of paperwork for field staff. So consultants were brought in, and shortly after careful observation a project was born. Instead of equipping the field staff with paperwork to shuffle, they were issued laptops with a custom-developed application and a local copy of SQL Server (MSDE back then). The fact that we implemented Merge Replication somewhat enticed me to refer to the field staff as subscribers, because I was trying to immerse myself in SQL Server terminology as much as I could. Not to mention I felt cool just speaking it out loud during meetings, since this was a new concept to our organization.

I have heard and read about the pros and cons of certifications, but I see them as one more way to prove that I can take my own initiative, along with my willingness to continue and further my knowledge and skill-set. It is another available avenue for continuing education. I have also heard that certifications combined with a degree tend to hold more weight than a certification alone, but that is debatable. In the end I suppose it is all up to you and how you feel about it. Some say it boosts personal self-confidence, and I am all for that. Why not?

The first exam I might tackle is 70-431: Microsoft SQL Server 2005 Implementation and Maintenance. I know it is a SQL Server 2005 exam, but at present I am more familiar with 2005 than 2008. I picked up the Microsoft Press Self-Paced book and will use that as my primary study material; I have heard and read many good things about this title. I also thought about 70-445: Business Intelligence Development and Maintenance. I cannot decide, actually, because I have a love for both. I see a need for Administration and for Business Intelligence development. Either way, I figured I would set study time to one hour a day at minimum, with weekends being optional. Whether I cover one chapter or several in that hour, so be it; I need to be strict with a schedule in order to stick with it. Luckily I have a license for VMware Workstation 7, which allows me to build a VM study system that I can literally destroy and revert when needed. This will provide an adequate study platform as I progress. One thing I still need to do is schedule the actual test, as that will make the process more official since I will actually be setting a realistic and attainable goal. Not to mention it will keep me focused, since the added penalty of wasting money will add fire to my priorities and keep my studies aligned.

I am interested in knowing: which would you pursue first, SQL Server Administration or Business Intelligence?

Installing SQL Server 2005 Express via Command Line

There are two ways to install SQL Server 2005: 1) from the UI or 2) from the command line. The UI installation is cool, but deploying via command line is (in my opinion) better, faster and, more importantly, consistent, provided you have all parameters specified in a batch file. There are technically two command-line options: 1) pure command line or 2) command line with an answer file.

pure command line

@echo off
color 17
Title SQL Server 2005 Express Unattended Install . . .
setup.exe /qb INSTANCENAME=DEV ADDLOCAL=SQL_Engine,SQL_Data_Files,SQL_Replication SECURITYMODE=SQL SAPWD=StrongPasswordGoesHere DISABLENETWORKPROTOCOLS=0 

command line with answer file

@echo off
color 17
Title SQL Server 2005 Express Unattended Install . . .
setup.exe /qb /settings "AnswerFile.ini"

answer file contents

[options]
INSTANCENAME=DEV
ADDLOCAL=SQL_Engine,SQL_Data_Files,SQL_Replication 
SECURITYMODE=SQL 
SAPWD=StrongPasswordGoesHere
DISABLENETWORKPROTOCOLS=0 

In both of the above examples, notice that I am not specifying installation paths, data file paths, etc. Any parameter I did not supply a value for will automatically use its default. I only needed the basic necessities listed for my subscriber deployments.
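Once an install completes, a quick sanity check never hurts. Here is a minimal T-SQL sketch, assuming the instance was created with the INSTANCENAME=DEV value above and SQL authentication from the batch file:

-- Connect to the new instance first (e.g. sqlcmd -S .\DEV -U sa -P StrongPasswordGoesHere),
-- then confirm the instance identity and version.
SELECT SERVERPROPERTY('ServerName')     AS 'ServerName'
      ,SERVERPROPERTY('InstanceName')   AS 'InstanceName'
      ,SERVERPROPERTY('Edition')        AS 'Edition'
      ,SERVERPROPERTY('ProductVersion') AS 'ProductVersion'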

For more information about additional parameters visit: How to: Install SQL Server 2005 from the Command Prompt

Concatenating Fields

In some instances you are required to piece data together by slapping fields into one. This is probably most common with names, addresses and such. Using the AdventureWorksLT database, I will illustrate how to concatenate a few fields into a single full-name field. First we need to analyze the data to see what we have.

SELECT FirstName
       ,MiddleName
       ,LastName
       ,Suffix
FROM [AdventureWorksLT].[SalesLT].[Customer]

Results
01.SampleData

As you can see, we have rows that contain only a first and last name, some that have a middle initial, and some that have a suffix. Unfortunately there was not a record that had only a first name, last name and suffix, so I modified record ID 12 because I wanted to cover every scenario. I find it helpful.

Based on the data, we don’t need to worry about the first and last name fields, but we do need to be concerned with the middle initial and suffix, since some are populated while others are not. So how are we going to handle this? There are a few ways, but I will show you how to address it with a CASE expression.

SELECT Firstname + space(1) +
       CASE
          WHEN MiddleName IS NULL THEN LastName
          ELSE MiddleName + space(1) + LastName
       END +
       CASE
          WHEN Suffix IS NOT NULL THEN space(1) + Suffix
          ELSE space(0)
       END 'FullName'
FROM [AdventureWorksLT].[SalesLT].[Customer]

Well, let’s review before we move on to the results… the first CASE expression basically states: when you find a NULL value for the MiddleName, just display the LastName; otherwise display the MiddleName, add a space, then display the LastName.

The second CASE expression is similar: when the Suffix field contains a value other than NULL, add a space and display the Suffix; otherwise show nothing.

Results
02.FieldsConcat
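As an aside, the same NULL propagation can be had more tersely with COALESCE, since concatenating a NULL yields NULL (under the default CONCAT_NULL_YIELDS_NULL setting). This is just a sketch of the alternative, not what I used above:

-- A NULL MiddleName or Suffix makes the whole concatenated fragment NULL,
-- and COALESCE then falls back to the empty string.
SELECT FirstName + space(1) +
       COALESCE(MiddleName + space(1), '') +
       LastName +
       COALESCE(space(1) + Suffix, '') 'FullName'
FROM [AdventureWorksLT].[SalesLT].[Customer]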

Now let’s put everything together to see how it all looks.

SELECT FirstName
       ,[MiddleName]
       ,[LastName]
       ,[Suffix]
       ,FirstName + space(1) +
       CASE
          WHEN MiddleName IS NULL THEN LastName
          ELSE MiddleName + space(1) + LastName
       END +
       CASE
          WHEN Suffix IS NOT NULL THEN space(1) + Suffix
          ELSE space(0)
       END 'FullName'
FROM [AdventureWorksLT].[SalesLT].[Customer]

Note:

The fact that this dataset has explicit NULL values, as opposed to blank/empty values, for the MiddleName and Suffix columns made this easy. If it were the latter, the results would differ and require additional measures.

Results
03.ConcatFields

I modified CustomerID 20 by replacing the NULL in the MiddleName field with a single space, and CustomerID 22 by replacing the NULL in the Suffix with a single space. The MiddleName case is more apparent, as it spaces out the FirstName and LastName more than it should; the Suffix case is less obvious, but it does add a space after the LastName.

Results
04.NonNULLValues

I was able to address these by adding two additional WHEN clauses to the existing CASE expressions.

SELECT FirstName
       ,[MiddleName]
       ,[LastName]
       ,[Suffix]
       ,FirstName + space(1) +
       CASE
          WHEN MiddleName IS NULL THEN LastName
          WHEN MiddleName = space(1) THEN LastName
          ELSE MiddleName + space(1) + LastName
       END +
       CASE
          WHEN Suffix = space(1) THEN space(0)
          WHEN Suffix IS NOT NULL THEN space(1) + Suffix
          ELSE ''
       END 'FullName'
FROM [AdventureWorksLT].[SalesLT].[Customer]

Results
05.NonNULLFixed
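For what it’s worth, NULLIF can fold these blank values back into NULL, which keeps the COALESCE variation from the earlier aside workable here too. A sketch, under the same single-space assumption:

-- NULLIF(col, space(1)) turns a single-space value back into NULL, so one
-- COALESCE per optional column covers both the NULL and blank cases.
SELECT FirstName + space(1) +
       COALESCE(NULLIF(MiddleName, space(1)) + space(1), '') +
       LastName +
       COALESCE(space(1) + NULLIF(Suffix, space(1)), '') 'FullName'
FROM [AdventureWorksLT].[SalesLT].[Customer]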

Another issue:

What if there were leading or trailing spaces in the FirstName and LastName fields? I modified record ID 1 and added ten spaces before and after Orlando.

Results
06.Whitespace-FirstnameColumn

Well, in this case I would keep the same syntax but would wrap the fields in LTRIM(RTRIM(FirstName)) and the like, as that removes the whitespace before and after.

SELECT FirstName
       ,[MiddleName]
       ,[LastName]
       ,[Suffix]
       ,LTRIM(RTRIM(FirstName)) + space(1) +
       CASE
          WHEN MiddleName IS NULL THEN LastName
          WHEN MiddleName = space(1) THEN LastName
          ELSE MiddleName + space(1) + LastName
       END +
       CASE
          WHEN Suffix = space(1) THEN space(0)
          WHEN Suffix IS NOT NULL THEN space(1) + Suffix
          ELSE ''
       END 'FullName'
FROM [AdventureWorksLT].[SalesLT].[Customer]

Results
07.Trimmed-Firstname

Formatting Date and Time

We synchronize GIS data differentially using a proprietary method. As part of the process a date-time parameter is passed, and the system updates the GIS accordingly given the timestamp value. Since this is a differential sync, only the changes that occurred within the range are applied to the subscriber. A specific format is required, but none of the formats in the example below meet the criteria exactly.

Format 101 provides the needed date style (MM/DD/YYYY), but there really is not a time format that fits my exact needs. This is where I improvise a bit. Since I pull values from a table with a “datetime” data type, I will break the date and time into two pieces and address each as needed. In this case I only need to worry about the time.

SELECT CONVERT(varchar, GETDATE(), 100) 'Mon DD YYYY hh:miAM'
	  ,CONVERT(varchar, GETDATE(), 101) 'MM/DD/YYYY'
	  ,CONVERT(varchar, GETDATE(), 102) 'YYYY.MM.DD'
	  ,CONVERT(varchar, GETDATE(), 103) 'DD/MM/YYYY'
	  ,CONVERT(varchar, GETDATE(), 104) 'DD.MM.YYYY'
	  ,CONVERT(varchar, GETDATE(), 105) 'DD-MM-YYYY'
	  ,CONVERT(varchar, GETDATE(), 106) 'DD Mon YYYY'
	  ,CONVERT(varchar, GETDATE(), 107) 'Mon DD, YYYY'
	  ,CONVERT(varchar, GETDATE(), 108) 'hh:mi:ss' -- 24HR
	  ,CONVERT(varchar, GETDATE(), 109) 'Mon DD YYYY hh:mi:ss:mmmAM'
	  ,CONVERT(varchar, GETDATE(), 110) 'MM-DD-YYYY'
	  ,CONVERT(varchar, GETDATE(), 111) 'YYYY/MM/DD'
	  ,CONVERT(varchar, GETDATE(), 112) 'YYYYMMDD'
	  ,CONVERT(varchar, GETDATE(), 113) 'DD Mon YYYY hh:mi:ss:mmm' --24HR
	  ,CONVERT(varchar, GETDATE(), 114) 'hh:mi:ss.mmm' -- 24HR
	  ,CONVERT(varchar, GETDATE(), 120) 'YYYY-MM-DD hh:mi:ss' --24HR
	  ,CONVERT(varchar, GETDATE(), 121) 'YYYY-MM-DD hh:mi:ss.mmm' --24HR
	  ,CONVERT(varchar, GETDATE(), 126) 'YYYY-MM-DDThh:mi:ss.mmm'
	  ,CONVERT(varchar, GETDATE(), 127) 'YYYY-MM-DDThh:mi:ss.mmmZ'
	  ,CONVERT(varchar, GETDATE(), 130) 'DD Mon YYYY hh:mi:ss:mmmAM'
	  ,CONVERT(varchar, GETDATE(), 131) 'DD/MM/YY hh:mi:ss:mmmAM'

At first I spotted format 108, which provides hh:mi:ss but lacks the AM/PM designation; appending that, though, would be simple. So here is what I did.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE
	     WHEN CONVERT(varchar(2),ModifiedDate,108) >= 12 -- 12:xx (noon) counts as PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35)

Note: I had to update AddressID 35 for illustration purposes.

Outcome
FirstAttempt

The end results are close, but not exactly there yet: the parameter only accepts time in a 12-hour format, not 24. So I turned to my old scripts to see if I had done this before. I hadn’t, but I had used DATEPART in one of them. After jumping into BOL and reading up on DATEPART, I found that I could break the time into smaller segments (hours, minutes, seconds, etc.) and format them individually. It would require more syntax, but the end result would be exactly what I needed.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE
	     WHEN CONVERT(varchar(2),ModifiedDate,108) >= 12
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,DATEPART(HH,ModifiedDate) 'HH'
	   ,DATEPART(MI,ModifiedDate) 'MI'
	   ,DATEPART(SS,ModifiedDate) 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35)

Outcome
Attempt2

Now the problem is that the MI column only displays a single digit when the value is less than 10, and the same holds for the HH and SS columns. So I modified AddressID 36 to illustrate this as well.

Outcome
Attempt3

This means I need to do two things: 1) change the time to a 12-hour format and 2) pad the HH, MI and SS columns with a leading zero. I’ll tackle item 2) first: STR right-justifies the number within the given length, which puts a space to the left of any single digit…

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE
	     WHEN CONVERT(varchar(2),ModifiedDate,108) >= 12
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,str(DATEPART(HH,ModifiedDate),2) 'HH'
	   ,str(DATEPART(MI,ModifiedDate),2) 'MI'
	   ,str(DATEPART(SS,ModifiedDate),2) 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Outcome
Attempt4

Now I need to replace the space with a zero…

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE
	     WHEN CONVERT(varchar(2),ModifiedDate,108) >= 12
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,REPLACE(str(DATEPART(HH,ModifiedDate),2),SPACE(1),'0') 'HH'
	   ,REPLACE(str(DATEPART(MI,ModifiedDate),2),SPACE(1),'0') 'MI'
	   ,REPLACE(str(DATEPART(SS,ModifiedDate),2),SPACE(1),'0') 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Outcome
Attempt5

Now we are looking better, but I still need to convert the 24-hour time to 12-hour time. So I’ll take the HH column and subtract 12 from it for any value greater than 12; if it is 12 or less, display it as normal. (Strictly speaking, midnight comes back from DATEPART as 0 and would display as 00 rather than 12, but none of my timestamps land there.)

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE
	     WHEN CONVERT(varchar(2),ModifiedDate,108) >= 12
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,CASE
	      WHEN (DATEPART(HH,ModifiedDate) > 12)
	      THEN REPLACE(str(DATEPART(HH,ModifiedDate)-12,2),SPACE(1),'0')
	      ELSE REPLACE(str(DATEPART(HH,ModifiedDate),2),SPACE(1),'0')
	   END AS 'HH'
	   ,REPLACE(str(DATEPART(MI,ModifiedDate),2),SPACE(1),'0') 'MI'
	   ,REPLACE(str(DATEPART(SS,ModifiedDate),2),SPACE(1),'0') 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Outcome
Attempt6

Zinga! Now the only thing left to do is concatenate the Date Only field with the HH, MI and SS fields and append the AM/PM designation.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE
	     WHEN CONVERT(varchar(2),ModifiedDate,108) >= 12
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
	   CASE
	      WHEN (DATEPART(HH,ModifiedDate) > 12)
	      THEN REPLACE(str(DATEPART(HH,ModifiedDate)-12,2),SPACE(1),'0')
	      ELSE REPLACE(str(DATEPART(HH,ModifiedDate),2),SPACE(1),'0')
	   END + ':' +
	   REPLACE(str(DATEPART(MI,ModifiedDate),2),SPACE(1),'0') + ':' +
	   REPLACE(str(DATEPART(SS,ModifiedDate),2),SPACE(1),'0') +
	   CASE 
	      WHEN (DATEPART(HH,ModifiedDate) < 12)
	      THEN ' AM'
	      ELSE ' PM'
	   END AS 'Usable Date'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Final Outcome
FinalAttempt
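Incidentally, the AM/PM designator itself could have been lifted straight from style 109 output, which ends in AM or PM, rather than derived from the hour. A small sketch of that alternative:

-- Style 109 is 'Mon DD YYYY hh:mi:ss:mmmAM (or PM)', so the last two
-- characters give the designator directly.
SELECT RIGHT(CONVERT(varchar(26), ModifiedDate, 109), 2) AS 'Designator'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)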

Using xp_cmdshell to Pass Parameters to VBS

For an existing client running SQL Server 2005 Standard, there was a need to take values from a result set and pass them to a VBS script that resided on the server. In my post “Formatting Date and Time” I ran through the steps of formatting a datetime data type to meet my needs.

I chose to pursue a pure T-SQL option, but there are other avenues I could have followed, such as the pure VBS approach recommended by Robert Davis (blog | @sqlsoldier), which I have used in the past and which worked quite well. Not sure why it didn’t come to mind first, as it is a viable solution. Then Dave Levy (blog | @Dave_Levy) suggested PowerShell, which is something I haven’t worked with yet. As interesting as PowerShell sounds, I had to produce something ASAP, but I will research it nonetheless.

So I chose a T-SQL path using the extended stored procedure xp_cmdshell to accomplish the task at hand. To get started, xp_cmdshell needs to be enabled, but first you might want to turn on advanced options to see if it is already on.

TSQL

EXECUTE sp_configure

Outcome
StandardOptions

As you can see, the result set is limited, so in order to see information about xp_cmdshell we need to switch on the advanced options… so let’s get to it.

EXECUTE sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
GO
EXECUTE sp_configure
GO

The results are in alphabetical order, so you’ll need to scroll all the way down to check whether config_value for xp_cmdshell is set to one. If it is set to zero then it’s disabled, and in that event we need to enable it.

Outcome
02.AdvOptions_v0.01
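Tip: once advanced options are on, you can skip the scrolling by asking sp_configure for the single option:

-- Passing just the option name returns only that row
EXECUTE sp_configure 'xp_cmdshell'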

Enabling it is simply a matter of setting the value to one:

EXECUTE sp_configure 'xp_cmdshell', '1'
RECONFIGURE WITH OVERRIDE
GO
EXECUTE sp_configure
GO

Now that xp_cmdshell is enabled we will see a value of one in the config_value field.

Outcome
03.xp_cmdshell_enabled

To test it let’s run the following:

DECLARE @results INT
DECLARE @cmd nvarchar(255)
SET @cmd = 'ping sqlsamson8' + ' -n 1 | find /i "reply"'

EXEC @results = master..xp_cmdshell @cmd
IF (@results = 0)
	SELECT 'ONLINE'
ELSE
	SELECT 'OFFLINE'

I don’t have a system called sqlsamson8 so ‘OFFLINE’ is returned.

Outcome
04.TestRun

Now let’s test this with a system that does exist.

DECLARE @results INT
DECLARE @cmd nvarchar(255)
SET @cmd = 'ping sqlsam7' + ' -n 1 | find /i "reply"'

EXEC @results = master..xp_cmdshell @cmd
IF (@results = 0)
	SELECT 'ONLINE'
ELSE
	SELECT 'OFFLINE'

Sure enough it’s ‘ONLINE’.

Outcome
05.TestRun

This is cool, but I need some form of log to be generated. In short, a file needs to be created and written to, recording whether the device was found online or offline.

DECLARE @results INT, @onLog NVARCHAR(50),@offLog NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @mn NVARCHAR(50), @cmd1 NVARCHAR(255)

-- Output results to file
SET @onLog = 'D:\xp_cmdshell\Online.txt'
SET @offLog = 'D:\xp_cmdshell\Offline.txt'
-- set target computer name
SET @mn = 'sqlsamson8'
-- set print process
SET @cmd = 'ping ' + @mn + ' -n 1 | find /i "reply"'
-- Execute ping process
EXEC @results = master..xp_cmdshell @cmd
IF (@results = 0)
	BEGIN
		-- Display online message to grid results
		SELECT @mn + ' IS ONLINE!'
		-- If device is online log computer name to output file
		SET @cmd1 =  'echo ' + @mn + ' >> ' + @onLog
		-- Execute statement
		EXEC master..xp_cmdshell @cmd1
	END
ELSE
	BEGIN
		-- Display offline message to grid results
		SELECT @mn + ' IS OFFLINE'
		-- If device is offline log computer name to output file
		SET @cmd1 =  'echo ' + @mn + ' >> ' + @offLog
		-- Execute statement
		EXEC master..xp_cmdshell @cmd1
	END	

Outcome
06.TestRun_WithOutputLog

Perfect. Exactly what I needed. However, I need to run through a series of devices, and having to manually specify a computer name is out of the question. I could use a CURSOR, but today I am in a WHILE loop kind of mood…

DECLARE @Cnt INT, @mn NVARCHAR(50), @newmn NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @results INT, @cmd1 NVARCHAR(255)
DECLARE @offLog NVARCHAR(50), @onLog NVARCHAR(50)

SET @onLog = 'D:\xp_cmdshell\ONLINE.txt'
SET @offLog = 'D:\xp_cmdshell\OFFLINE.txt'
SET @mn = 'sqlsam'
SET @Cnt = 0

WHILE @Cnt < 10
BEGIN
	SET @Cnt = @Cnt + 1
	-- take @mn and append @Cnt (i.e. sqlsam + 1 = sqlsam1)
	SET @newmn = @mn + CONVERT(VARCHAR,@Cnt)
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
		IF (@results = 0)
			BEGIN
				-- Display online message to grid results
				PRINT @newmn + ' IS ONLINE!'
				-- If device is online log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @onLog
				-- Execute statement
				EXEC master..xp_cmdshell  @cmd1
			END
		ELSE
			BEGIN
				-- Display offline message to grid results
				PRINT @newmn + ' IS OFFLINE'
				-- If device is offline log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @offLog
				-- Execute statement
				EXEC master..xp_cmdshell @cmd1
			END
END

Great… loop and ping through the devices and record to the logs accordingly. Bingo!

Outcome
07.TestRunLoop_WithOutputLog

Well, there is one problem… if I run this again the results will be appended to the existing logs, essentially doubling them up with the same data. This is completely avoidable if I append a date-time stamp to the log file names!

DECLARE @Cnt INT, @mn NVARCHAR(50), @newmn NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @results INT, @cmd1 NVARCHAR(255)
DECLARE @offLog NVARCHAR(50), @onLog NVARCHAR(50),@dtStamp NVARCHAR(50)

SET @dtStamp = CONVERT(varchar, GETDATE(), 112) + '_' +
               REPLACE(CONVERT(varchar(5), GETDATE(), 108),':','')

SET @onLog = 'D:\xp_cmdshell\ONLINE_' + @dtStamp +'.txt'
SET @offLog = 'D:\xp_cmdshell\OFFLINE_' + @dtStamp +'.txt'

SET @mn = 'sqlsam'
SET @Cnt = 0

WHILE @Cnt < 10
BEGIN
	SET @Cnt = @Cnt + 1
	-- take @mn and append @Cnt (i.e. sqlsam + 1 = sqlsam1)
	SET @newmn = @mn + CONVERT(VARCHAR,@Cnt)
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
		IF (@results = 0)
			BEGIN
				-- Display online message to grid results
				PRINT @newmn + ' IS ONLINE!'
				-- If device is online log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @onLog
				-- Execute statement
				EXEC master..xp_cmdshell  @cmd1
			END
		ELSE
			BEGIN
				-- Display offline message to grid results
				PRINT @newmn + ' IS OFFLINE'
				-- If device is offline log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @offLog
				-- Execute statement
				EXEC master..xp_cmdshell @cmd1
			END
END

Outcome
08.TestRunLoop_WithOutputLog_TimeStamp

Now I only need to add the VBS path and execute it, and then we are golden!

DECLARE @Cnt INT, @mn NVARCHAR(50), @newmn NVARCHAR(50), @vbscmd NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @results INT, @cmd1 NVARCHAR(255)
DECLARE @offLog NVARCHAR(50), @onLog NVARCHAR(50),@dtStamp NVARCHAR(50)

SET @dtStamp = CONVERT(varchar, GETDATE(), 112) + '_' +
               REPLACE(CONVERT(varchar(5), GETDATE(), 108),':','')

SET @onLog =  'D:\xp_cmdshell\ONLINE_' + @dtStamp +'.txt'
SET @offLog =  'D:\xp_cmdshell\OFFLINE_' + @dtStamp +'.txt'


SET @mn = 'sqlsamson'
SET @Cnt = 0

WHILE @Cnt < 10
BEGIN
	SET @Cnt = @Cnt + 1
	-- take @mn and append @Cnt (i.e. sqlsamson + 1 = sqlsamson1)
	SET @newmn = @mn + CONVERT(VARCHAR,@Cnt)
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
		IF (@results = 0)
			BEGIN
			SET @vbscmd = 'cscript /nologo D:\test.vbs ' + '"' + @newmn + '"'
				-- Display online message to grid results
				PRINT @newmn + ' IS ONLINE!'
				-- If device is online log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @onLog
				-- Execute write statement
				EXEC master..xp_cmdshell @cmd1
				-- Execute vbs statement
				PRINT @vbscmd
				EXEC master..xp_cmdshell @vbscmd
			END
		ELSE
			BEGIN
				-- Display offline message to grid results
				PRINT @newmn + ' IS OFFLINE'
				-- If device is offline log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @offLog
				-- Execute statement
				EXEC master..xp_cmdshell @cmd1
			END
END

Outcome
09.ParameterPassedToVBS

Of course this only works when the naming convention is standardized and the numbers are sequential. I could create a temp table to pull in all the device names, add an ID field that auto-increments by one, and then run through the loop selecting from the temp table where ID = @Cnt. That is a feasible option; a rough sketch follows.
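Here is a minimal sketch of that temp-table variation. The dbo.Devices table and its Name column are hypothetical stand-ins for wherever the device names actually live:

DECLARE @Cnt INT, @Max INT, @newmn NVARCHAR(50)

-- Pull the device names into a temp table keyed by an auto-incrementing ID
CREATE TABLE #Devices (ID INT IDENTITY(1,1), DeviceName NVARCHAR(50))
INSERT INTO #Devices (DeviceName)
SELECT [Name] FROM dbo.Devices -- hypothetical source of device names

SELECT @Max = MAX(ID) FROM #Devices
SET @Cnt = 0

WHILE @Cnt < @Max
BEGIN
	SET @Cnt = @Cnt + 1
	-- grab the device name matching the current loop counter
	SELECT @newmn = DeviceName FROM #Devices WHERE ID = @Cnt
	-- the ping/log/vbs logic from the loop above goes here
	PRINT @newmn
END

DROP TABLE #Devices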

Replication Process Order

Recently, ahead of a deployment, I ran through a replication setup rehearsal in order to practice my process. It is always good practice to run through an upgrade rehearsal and document your steps, experiences and results. For a brief background: in all of our deployments we use merge replication, since we send and receive data between our server and mobile subscribers.

As I ran through the article selection process I selected the appropriate tables, indicated their sync direction (download-only or bi-directional), followed with the stored procedures and views, and ended with the functions as usual. We didn’t have any indexed views; otherwise they would have gone before the functions. Little did I realize that one view happened to reference two other views, meaning that in order for the initialization to complete successfully the two referenced views would need to be created first. The rule “you cannot put the cart before the horse” definitely applies here.

So to clarify, here is a simple scenario: you have a table article and a view that directly references that table… simple enough. Well, you cannot have the view processed before the table, because the view has a dependency that requires the table to be in place first. Otherwise the initial snapshot at the subscriber will fail. Hopefully that makes sense.

In the case of the two views, they needed to be processed before the referencing view could be applied. The quick and dirty workaround was to remove the article, reapply it, and then generate a new snapshot. Honestly, this is not an acceptable practice even with a small subscriber base of three, and especially not when dealing with subscriber counts that exceed 200. The problem is that any new subscribers we introduce will fail at the initial snapshot delivery on this very article because of its dependencies, even though the process works fine when you re-initialize an existing subscriber.

So the method to rectify this issue is to adjust the article processing order. First, review the current order by running sp_helpmergearticle as follows:

USE [AdventureWorks]
exec sp_helpmergearticle

It returned a result set of all the articles, including the name (article name), upload_option (sync direction), processing_order and so on. The information proved very useful, as it showed that the processing_order values were all set to zero. In the event that you want information about a specific article, the following statement will do just that.

USE [AdventureWorks]
exec sp_helpmergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'AddressType'

As the first query showed, the default processing_order values were all zero, so the article would always revert back to its original position. Luckily there is an option to change this using the sp_changemergearticle stored procedure:

USE [AdventureWorks]
exec sp_changemergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'AddressType'
     ,@property = 'processing_order'
     ,@value = 10
     ,@force_invalidate_snapshot = 1
     ,@force_reinit_subscription = 0

Just to make sure it worked, I re-ran the first query and sure enough the article in question dropped down to the very last position in the result set now that its processing_order value was set to 10. The real test came when I staged a new subscriber by dropping the local database and clearing out the accounts: the delivery of the initial snapshot processed successfully without any further errors. Now I can rest better knowing that the support staff can add subscribers without running into this issue.
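For a longer chain of dependent views, the same call just gets repeated with ascending values, assuming lower values are processed first (which is consistent with the behavior above, where the value-10 article moved to last). A sketch with hypothetical article names, where vBase1 and vBase2 are the views referenced by vTop:

USE [AdventureWorks]
-- The referenced views get smaller processing_order values than the
-- view that depends on them, so they are created first at the subscriber.
exec sp_changemergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'vBase1', @property = 'processing_order', @value = 1
     ,@force_invalidate_snapshot = 1, @force_reinit_subscription = 0
exec sp_changemergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'vBase2', @property = 'processing_order', @value = 2
     ,@force_invalidate_snapshot = 1, @force_reinit_subscription = 0
exec sp_changemergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'vTop', @property = 'processing_order', @value = 3
     ,@force_invalidate_snapshot = 1, @force_reinit_subscription = 0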