Formatting Date and Time

We synchronize GIS data differentially using a proprietary method. As part of the process a date-time parameter is passed, and the system applies updates to the GIS based on that time stamp value. Since this is a differential sync, only the changes that occurred within the range are applied to the subscriber. A specific format is required, but none of the formats in the example below meet the criteria exactly.

Format 101 provides the needed date style (MM/DD/YYYY), but there really is no time format that fits my exact needs. This is where I improvise a bit. Since I pull values from a table column with a datetime data type, I will break the date and time into two pieces and address each as needed. In this case I only need to worry about the time.

SELECT CONVERT(varchar, GETDATE(), 100) 'Mon DD YYYY hh:miAM'
	  ,CONVERT(varchar, GETDATE(), 101) 'MM/DD/YYYY'
	  ,CONVERT(varchar, GETDATE(), 102) 'YYYY.MM.DD'
	  ,CONVERT(varchar, GETDATE(), 103) 'DD/MM/YYYY'
	  ,CONVERT(varchar, GETDATE(), 104) 'DD.MM.YYYY'
	  ,CONVERT(varchar, GETDATE(), 105) 'DD-MM-YYYY'
	  ,CONVERT(varchar, GETDATE(), 106) 'DD Mon YYYY'
	  ,CONVERT(varchar, GETDATE(), 107) 'Mon DD, YYYY'
	  ,CONVERT(varchar, GETDATE(), 108) 'hh:mi:ss' -- 24HR
	  ,CONVERT(varchar, GETDATE(), 109) 'Mon DD YYYY hh:mi:ss:mmmAM'
	  ,CONVERT(varchar, GETDATE(), 110) 'MM-DD-YYYY'
	  ,CONVERT(varchar, GETDATE(), 111) 'YYYY/MM/DD'
	  ,CONVERT(varchar, GETDATE(), 112) 'YYYYMMDD'
	  ,CONVERT(varchar, GETDATE(), 113) 'DD Mon YYYY hh:mi:ss:mmm' --24HR
	  ,CONVERT(varchar, GETDATE(), 114) 'hh:mi:ss.mmm' -- 24HR
	  ,CONVERT(varchar, GETDATE(), 120) 'YYYY-MM-DD hh:mi:ss' --24HR
	  ,CONVERT(varchar, GETDATE(), 121) 'YYYY-MM-DD hh:mi:ss.mmm' --24HR
	  ,CONVERT(varchar, GETDATE(), 126) 'YYYY-MM-DDThh:mi:ss.mmm'
	  ,CONVERT(varchar, GETDATE(), 127) 'YYYY-MM-DDThh:mi:ss.mmmZ'
	  ,CONVERT(varchar, GETDATE(), 130) 'DD Mon YYYY hh:mi:ss:mmmAM'
	  ,CONVERT(varchar, GETDATE(), 131) 'DD/MM/YY hh:mi:ss:mmmAM'

At first I spotted format 108, which provides hh:mi:ss, but it lacks the AM/PM designation. Appending it, however, would be simple. So here is what I did.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE 
	     WHEN DATEPART(HH,ModifiedDate) >= 12 -- hours 12 through 23 (noon included) are PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35)

Note: I had to update AddressID 35 for illustration purposes.

Outcome
FirstAttempt

The end results are close but not exactly there yet. The parameter only accepts time in a 12-hour format, not 24. So I turned to my old scripts to see if I had done this before. I hadn’t, but I did use DATEPART in one of them. After jumping into BOL and reading up on DATEPART I found that I could break the time apart into smaller segments (hours, minutes, seconds, etc.) and format them individually. It would require more syntax, but the end results would be exactly what I needed.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE 
	     WHEN DATEPART(HH,ModifiedDate) >= 12 -- hours 12 through 23 (noon included) are PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,DATEPART(HH,ModifiedDate) 'HH'
	   ,DATEPART(MI,ModifiedDate) 'MI'
	   ,DATEPART(SS,ModifiedDate) 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35)

Outcome
Attempt2

Now the problem is that the MI column only displays a single digit when the value is less than 10. The HH and SS columns behave the same way. So I modified AddressID 36 to illustrate this as well.

Outcome
Attempt3

This means I need to do two things: 1) change the time to return in a 12-hour format and 2) pad the HH, MI & SS columns with a leading zero. I’ll tackle item 2) first by adding a space to the left of the number…

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE 
	     WHEN DATEPART(HH,ModifiedDate) >= 12 -- hours 12 through 23 (noon included) are PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,str(DATEPART(HH,ModifiedDate),2) 'HH'
	   ,str(DATEPART(MI,ModifiedDate),2) 'MI'
	   ,str(DATEPART(SS,ModifiedDate),2) 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Outcome
Attempt4

Now I need to replace the space with a zero…

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE 
	     WHEN DATEPART(HH,ModifiedDate) >= 12 -- hours 12 through 23 (noon included) are PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,REPLACE(str(DATEPART(HH,ModifiedDate),2),SPACE(1),0) 'HH'
	   ,REPLACE(str(DATEPART(MI,ModifiedDate),2),SPACE(1),0) 'MI'
	   ,REPLACE(str(DATEPART(SS,ModifiedDate),2),SPACE(1),0) 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Outcome
Attempt5

Now we are looking better, but I still need to convert the 24-hour time to 12-hour time. So I’ll take the HH column and subtract 12 from it for any value greater than 12; anything else displays as normal.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,101) 'Date Only'
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE 
	     WHEN DATEPART(HH,ModifiedDate) >= 12 -- hours 12 through 23 (noon included) are PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
	   ,CASE 
	      WHEN (DATEPART(HH,ModifiedDate) > 12)
	      THEN REPLACE(str(DATEPART(HH,ModifiedDate)-12,2),SPACE(1),0)
	      ELSE REPLACE(str(DATEPART(HH,ModifiedDate),2),SPACE(1),0)
	   END AS 'HH'
	   ,REPLACE(str(DATEPART(MI,ModifiedDate),2),SPACE(1),0) 'MI'
	   ,REPLACE(str(DATEPART(SS,ModifiedDate),2),SPACE(1),0) 'SS'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Outcome
Attempt6

Zinga! Now the only thing left to do is to concatenate the Date Only field with the HH, MI and SS fields and append the AM/PM designation.

SELECT [ModifiedDate]
       ,CONVERT(varchar,ModifiedDate,108) 'Time Only'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
       CASE 
	     WHEN DATEPART(HH,ModifiedDate) >= 12 -- hours 12 through 23 (noon included) are PM
	     THEN CONVERT(varchar,ModifiedDate,108)+ ' PM'
	     ELSE CONVERT(varchar,ModifiedDate,108)+ ' AM'
       END AS 'New ModifiedDate'
       ,CONVERT(varchar,ModifiedDate,101) + ' ' +
	   CASE 
	      WHEN (DATEPART(HH,ModifiedDate) > 12)
	      THEN REPLACE(str(DATEPART(HH,ModifiedDate)-12,2),SPACE(1),0)
	      ELSE REPLACE(str(DATEPART(HH,ModifiedDate),2),SPACE(1),0)
	   END + ':' +
	   REPLACE(str(DATEPART(MI,ModifiedDate),2),SPACE(1),0) + ':' +
	   REPLACE(str(DATEPART(SS,ModifiedDate),2),SPACE(1),0) +
	   CASE 
	      WHEN (DATEPART(HH,ModifiedDate) < 12)
	      THEN ' AM'
	      ELSE ' PM'
	   END AS 'Usable Date'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)

Final Outcome
FinalAttempt
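One edge case the finished query does not cover: a ModifiedDate falling in the midnight hour. DATEPART(HH, ...) returns 0 for those rows, so the hour would render as "00" with AM rather than "12". Here is a hedged sketch of the same 'Usable Date' expression with that branch added (same table and rows as above):

```sql
-- Sketch: the 'Usable Date' hour CASE with a branch for the midnight hour,
-- where DATEPART(HH, ...) returns 0 and would otherwise render as '00'.
SELECT CONVERT(varchar, ModifiedDate, 101) + ' ' +
       CASE
          WHEN DATEPART(HH, ModifiedDate) = 0 THEN '12'   -- midnight -> 12 AM
          WHEN DATEPART(HH, ModifiedDate) > 12            -- 13-23 -> 01-11
          THEN REPLACE(str(DATEPART(HH, ModifiedDate) - 12, 2), SPACE(1), 0)
          ELSE REPLACE(str(DATEPART(HH, ModifiedDate), 2), SPACE(1), 0)
       END + ':' +
       REPLACE(str(DATEPART(MI, ModifiedDate), 2), SPACE(1), 0) + ':' +
       REPLACE(str(DATEPART(SS, ModifiedDate), 2), SPACE(1), 0) +
       CASE
          WHEN DATEPART(HH, ModifiedDate) < 12 THEN ' AM'
          ELSE ' PM'
       END AS 'Usable Date'
FROM [AdventureWorks].[Person].[Address]
WHERE [AddressID] IN (34,35,36)
```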


Using xp_cmdshell to pass parameters to vbs

For an existing client running SQL Server 2005 Standard there was a need to take values from a result set and pass them to a VBScript (.vbs) file that resided on the server. In my post “Formatting Date and Time” I ran through the steps of formatting a datetime data type to meet my needs.

I chose to pursue a pure TSQL-based option, but there are other avenues I could have followed, such as the pure VBS approach recommended by Robert Davis (blog | @sqlsoldier), which I have used in the past and which worked quite well. I'm not sure why it didn’t come to mind first, as it is a viable solution. Then Dave Levy (blog | @Dave_Levy) suggested PowerShell, which is something I actually haven’t worked with yet. As interesting as PowerShell sounds, I had to produce something ASAP, but I would research PowerShell in the process nonetheless.

So I chose a TSQL path using the extended stored procedure xp_cmdshell to accomplish the task at hand. To get started xp_cmdshell needs to be enabled, but first you might want to turn on advanced options to see if it is already on.

TSQL

EXECUTE sp_configure

Outcome
StandardOptions

As you can see the result set is limited, so in order to see information about xp_cmdshell we need to switch on the advanced options… so let’s get to it.

EXECUTE sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
GO
EXECUTE sp_configure
GO

The options are listed in alphabetical order, so you’ll need to scroll all the way down to see whether the config_value is set to one. If it is set to zero then it’s disabled, and in that event we need to enable it.

Outcome
02.AdvOptions_v0.01
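As a side note, once “show advanced options” is on you can skip the scrolling by asking sp_configure for just the one setting you care about:

```sql
-- Returns only the xp_cmdshell row instead of the full option list
EXECUTE sp_configure 'xp_cmdshell'
```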

That is done by setting the value to one, which is as simple as running the following:

EXECUTE sp_configure 'xp_cmdshell', 1
RECONFIGURE WITH OVERRIDE
GO
EXECUTE sp_configure
GO

Now that xp_cmdshell is enabled we will see a value of one in the config_value field.

Outcome
03.xp_cmdshell_enabled

To test it let’s run the following:

DECLARE @results INT
DECLARE @cmd nvarchar(255)
SET @cmd = 'ping sqlsamson8' + ' -n 1 | find /i "reply"'

EXEC @results = master..xp_cmdshell @cmd
IF (@results = 0)
	SELECT 'ONLINE'
ELSE
	SELECT 'OFFLINE'

I don’t have a system called sqlsamson8 so ‘OFFLINE’ is returned.

Outcome
04.TestRun

Now let’s test this with a system that does exist.

DECLARE @results INT
DECLARE @cmd nvarchar(255)
SET @cmd = 'ping sqlsam7' + ' -n 1 | find /i "reply"'

EXEC @results = master..xp_cmdshell @cmd
IF (@results = 0)
	SELECT 'ONLINE'
ELSE
	SELECT 'OFFLINE'

Sure enough it’s ‘ONLINE’.

Outcome
05.TestRun

This is cool, but I need some form of log to be generated. In short, a file needs to be created and written to, recording whether the device was found online or offline.

DECLARE @results INT, @onLog NVARCHAR(50),@offLog NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @mn NVARCHAR(50), @cmd1 NVARCHAR(255)

-- Output results to file
SET @onLog = 'D:\xp_cmdshell\Online.txt'
SET @offLog = 'D:\xp_cmdshell\Offline.txt'
-- set target computer name
SET @mn = 'sqlsamson8'
-- set print process
SET @cmd = 'ping ' + @mn + ' -n 1 | find /i "reply"'
-- Execute ping process
EXEC @results = master..xp_cmdshell @cmd
IF (@results = 0)
	BEGIN
		-- Display online message to grid results
		SELECT @mn + ' IS ONLINE!'
		-- If device is online log computer name to output file
		SET @cmd1 =  'echo ' + @mn + ' >> ' + @onLog
		-- Execute statement
		EXEC master..xp_cmdshell @cmd1
	END
ELSE
	BEGIN
		-- Display offline message to grid results
		SELECT @mn + ' IS OFFLINE'
		-- If device is offline log computer name to output file
		SET @cmd1 =  'echo ' + @mn + ' >> ' + @offLog
		-- Execute statement
		EXEC master..xp_cmdshell @cmd1
	END	

Outcome
06.TestRun_WithOutputLog

Perfect, exactly what I needed. However, I need to run through a series of devices, and manually specifying each computer name is out of the question. I could use a CURSOR, but today I am in a WHILE loop kind of mood…

DECLARE @Cnt INT, @mn NVARCHAR(50), @newmn NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @results INT, @cmd1 NVARCHAR(255)
DECLARE @offLog NVARCHAR(50), @onLog NVARCHAR(50)

SET @onLog = 'D:\xp_cmdshell\ONLINE.txt'
SET @offLog = 'D:\xp_cmdshell\OFFLINE.txt'
SET @mn = 'sqlsam'
SET @Cnt = 0

WHILE @Cnt < 10
BEGIN
	SET @Cnt = @Cnt + 1
	-- take @mn and append @Cnt (i.e. sqlsam + 1 = sqlsam1)
	SET @newmn = @mn + CONVERT(VARCHAR,@Cnt)
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
		IF (@results = 0)
			BEGIN
				-- Display online message to grid results
				PRINT @newmn + ' IS ONLINE!'
				-- If device is online log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @onLog
				-- Execute statement
				EXEC master..xp_cmdshell  @cmd1
			END
		ELSE
			BEGIN
				-- Display offline message to grid results
				PRINT @newmn + ' IS OFFLINE'
				-- If device is offline log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @offLog
				-- Execute statement
				EXEC master..xp_cmdshell @cmd1
			END
END

Great… loop and ping through the devices and record to the logs accordingly. Bingo!

Outcome
07.TestRunLoop_WithOutputLog

Well, there is one problem… if I run this again the results will be appended to the existing logs, essentially doubling the results with the same data. This is completely avoidable if I append a date-time stamp to the log file names!

DECLARE @Cnt INT, @mn NVARCHAR(50), @newmn NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @results INT, @cmd1 NVARCHAR(255)
DECLARE @offLog NVARCHAR(50), @onLog NVARCHAR(50),@dtStamp NVARCHAR(50)

SET @dtStamp =	CONVERT(varchar, GETDATE(), 112) + '_' + 
								REPLACE(CONVERT(varchar(5), GETDATE(), 108),':','')

SET @onLog = 'D:\xp_cmdshell\ONLINE_' + @dtStamp +'.txt'
SET @offLog = 'D:\xp_cmdshell\OFFLINE_' + @dtStamp +'.txt'

SET @mn = 'sqlsam'
SET @Cnt = 0

WHILE @Cnt < 10
BEGIN
	SET @Cnt = @Cnt + 1
	-- take @mn and append @Cnt (i.e. sqlsam + 1 = sqlsam1)
	SET @newmn = @mn + CONVERT(VARCHAR,@Cnt)
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
		IF (@results = 0)
			BEGIN
				-- Display online message to grid results
				PRINT @newmn + ' IS ONLINE!'
				-- If device is online log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @onLog
				-- Execute statement
				EXEC master..xp_cmdshell  @cmd1
			END
		ELSE
			BEGIN
				-- Display offline message to grid results
				PRINT @newmn + ' IS OFFLINE'
				-- If device is offline log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @offLog
				-- Execute statement
				EXEC master..xp_cmdshell @cmd1
			END
END

Outcome
08.TestRunLoop_WithOutputLog_TimeStamp

Now I only need to add the VBS path, execute it and then we are golden!

DECLARE @Cnt INT, @mn NVARCHAR(50), @newmn NVARCHAR(50), @vbscmd NVARCHAR(50)
DECLARE @cmd NVARCHAR(255), @results INT, @cmd1 NVARCHAR(255)
DECLARE @offLog NVARCHAR(50), @onLog NVARCHAR(50),@dtStamp NVARCHAR(50)

SET @dtStamp =	CONVERT(varchar, GETDATE(), 112) + '_' + 
								REPLACE(CONVERT(varchar(5), GETDATE(), 108),':','')

SET @onLog =  'D:\xp_cmdshell\ONLINE_' + @dtStamp +'.txt'
SET @offLog =  'D:\xp_cmdshell\OFFLINE_' + @dtStamp +'.txt'


SET @mn = 'sqlsamson'
SET @Cnt = 0

WHILE @Cnt < 10
BEGIN
	SET @Cnt = @Cnt + 1
	-- take @mn and append @Cnt (i.e. sqlsamson + 1 = sqlsamson1)
	SET @newmn = @mn + CONVERT(VARCHAR,@Cnt)
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
		IF (@results = 0)
			BEGIN
			SET @vbscmd = 'cscript /nologo D:\test.vbs ' + '"' + @newmn + '"'
				-- Display online message to grid results
				PRINT @newmn + ' IS ONLINE!'
				-- If device is online log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @onLog
				-- Execute write statement
				EXEC master..xp_cmdshell @cmd1
				-- Execute vbs statement
				PRINT @vbscmd
				EXEC master..xp_cmdshell @vbscmd
			END
		ELSE
			BEGIN
				-- Display offline message to grid results
				PRINT @newmn + ' IS OFFLINE'
				-- If device is offline log computer name to output file
				SET @cmd1 =  'echo ' + @newmn + ' >> ' + @offLog
				-- Execute statement
				EXEC master..xp_cmdshell @cmd1
			END
END

Outcome
09.ParameterPassedToVBS

Of course this only works when the naming convention is standardized and the numbers are sequential. I could create a temp table to pull in all the device names and add an ID field that auto-increments by one, then run through the loop selecting from the temp table where ID = @Cnt. That is a feasible option.
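That temp-table approach could be sketched roughly as follows (the table name, the device names, and where they come from are all hypothetical here; in practice the INSERTs would pull from an inventory table):

```sql
-- Sketch: drive the ping loop from a temp table of device names
-- instead of relying on a sequential naming convention.
DECLARE @Cnt INT, @Max INT, @newmn NVARCHAR(50), @cmd NVARCHAR(255), @results INT

CREATE TABLE #Devices
(
	ID INT IDENTITY(1,1),    -- auto-incrementing ID to walk the loop with
	MachineName NVARCHAR(50)
)

-- Hypothetical device names; these rows would come from an inventory table
INSERT INTO #Devices (MachineName) VALUES ('sqlsam1')
INSERT INTO #Devices (MachineName) VALUES ('gisbox2')
INSERT INTO #Devices (MachineName) VALUES ('mobile7')

SELECT @Max = MAX(ID) FROM #Devices
SET @Cnt = 0

WHILE @Cnt < @Max
BEGIN
	SET @Cnt = @Cnt + 1
	SELECT @newmn = MachineName FROM #Devices WHERE ID = @Cnt
	SET @cmd = 'ping ' + @newmn + ' -n 1 | find /i "reply"'
	EXEC @results = master..xp_cmdshell @cmd
	-- log to the online/offline files here exactly as in the loop above
	PRINT @newmn + CASE WHEN @results = 0 THEN ' IS ONLINE!' ELSE ' IS OFFLINE' END
END

DROP TABLE #Devices
```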

Get them Server Properties

Though this may be trivial it is always useful. There are many ways to determine what version, edition and service pack level of SQL Server you are running. So here is one way to get the information you need.

TSQL

SELECT  RIGHT(LEFT(@@VERSION,25),15)        AS [Product]
	    ,SERVERPROPERTY('productversion')   AS [Version]
	    ,SERVERPROPERTY('edition')          AS [Edition]
	    ,SERVERPROPERTY('productlevel')     AS [Service Pack]
        ,SERVERPROPERTY('ServerName')       AS [ServerName]

Results

5-26-2010 10-42-34 PM

Note:

If the character length of the product name (“SQL Server 2008” is 15 characters) changes then line 1 will need to be adjusted accordingly. I haven’t searched for a better solution; however, if you know of one please feel free to pass on the information. I would appreciate it.

Updated: 2010-06-05

Recently, as I was working more with CHARINDEX, I thought I might use it to extract the product from @@VERSION. The number 15 is the character length of “SQL Server 2008”, hence the 15 of course.

SELECT SUBSTRING(@@VERSION,CHARINDEX('SQL',@@VERSION),15) 'Product'

It will return SQL Server 2008. Again, if the product name changes length then this will be off.
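One possible length-independent variant, assuming @@VERSION keeps its usual "Microsoft SQL Server XXXX (SPn) - …" layout with a space and opening parenthesis right after the product name:

```sql
-- Take everything from 'SQL' up to (but not including) the ' (' that
-- precedes the service pack level, so the length is computed, not hard-coded
SELECT SUBSTRING(@@VERSION,
                 CHARINDEX('SQL', @@VERSION),
                 CHARINDEX(' (', @@VERSION) - CHARINDEX('SQL', @@VERSION)) 'Product'
```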

Reporting Services Date Parameters

Being the guy that I am, I often try to implement pure TSQL solutions, especially when working with parameters via Reporting Services. I know it seems like they go hand-in-hand, and to a great extent they do, but there are some cases where it is best to use expressions for the report parameters over a stored procedure. Don’t get me wrong, my TSQL approach worked fine, but when I used the expression I was able to accomplish the exact same results with slightly less syntax to write and possibly less overhead. I stumbled upon the blog post “Calculating the first and last day of the month” by datageek on blogspot, which illustrates the expressions I used for my report.

TSQL Syntax

SELECT DATEADD(Month,DATEDIFF(Month,0,getdate()),-1)+1 'StartDate'
SELECT DATEADD(Month,DATEDIFF(Month,0,getdate())+1,-1) 'EndDate'
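The idiom above may be easier to follow broken into its pieces; day 0 in T-SQL date math is 1900-01-01 and day -1 is 1899-12-31:

```sql
-- Count the whole months elapsed since day 0 (1900-01-01)...
SELECT DATEDIFF(Month, 0, GETDATE()) 'Months since 1900-01-01'
-- ...then add that many months to 1899-12-31, which lands on the last day
-- of the previous month; + 1 rolls forward to the 1st of the current month
SELECT DATEADD(Month, DATEDIFF(Month, 0, GETDATE()), -1) + 1 'StartDate'
-- Adding one extra month lands on the last day of the current month
SELECT DATEADD(Month, DATEDIFF(Month, 0, GETDATE()) + 1, -1) 'EndDate'
```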

RS Expression Syntax

Note: From what I understand, the following can be used in Access as well to get the first and last days of the month.

First Day of the Month

=DateSerial(Year(Now), Month(Now), 1)

Last Day of the Month

=DateSerial(Year(Now), Month(Now) + 1, 0)

So for illustration purposes I put together a report based on the AdventureWorks database and used the RS Expression Syntax to populate the date fields, but first let’s create the stored procedure.

Stored Procedure

CREATE PROCEDURE sp_RP_GetEmployeeData
(
	@StartDate datetime
	,@EndDate datetime
)

AS

SELECT	e.EmployeeKey
	--, e.ParentEmployeeKey
	--, e.EmployeeNationalIDAlternateKey
	--, e.ParentEmployeeNationalIDAlternateKey
	--, e.SalesTerritoryKey
	, e.FirstName
	, e.LastName
	--, e.MiddleName
	--, e.NameStyle
	, e.Title
	, e.HireDate
	--, e.BirthDate
	--, e.LoginID
	--, e.EmailAddress
	, e.Phone
	--, e.MaritalStatus
	--, e.EmergencyContactName
	--, e.EmergencyContactPhone
	, e.SalariedFlag
	--, e.Gender
	--, e.PayFrequency
	, e.BaseRate
	--, e.VacationHours
	--, e.SickLeaveHours
	--, e.CurrentFlag
	--, e.SalesPersonFlag
	--, e.DepartmentName
	--, e.StartDate
	, e.EndDate
	--, e.Status
	--, st.SalesTerritoryCountry
	--, st.SalesTerritoryRegion
	--, st.SalesTerritoryGroup

FROM AdventureWorksDW2008.dbo.DimEmployee AS e
JOIN AdventureWorksDW2008.dbo.DimSalesTerritory AS st
ON e.SalesTerritoryKey = st.SalesTerritoryKey
WHERE (HireDate BETWEEN @StartDate AND @EndDate)
ORDER BY e.HireDate

Now that we have the sProc in place, building the report is pretty much the same; however, the report parameters have relocated. They are now found on the left-hand side under the report data section. In 2005 you would have found the parameters under the Report menu.

Now to set default values for the parameters

1) Right click on the StartDate parameter and select “Parameter Properties”

5-26-2010 8-16-31 AMa

2) At the Properties screen click on “Default Values”

5-26-2010 8-25-56 AM

3) Then select “Specify Values”

5-26-2010 8-41-10 AM

4) Click on the “Expression” button

5-26-2010 8-47-24 AM

5) Use the RS Expression Syntax for the “First Day of the Month”

5-26-2010 9-01-28 AM

6) Now follow the steps 1-5 and set the default value for the EndDate parameter but this time use the RS Syntax for the “Last Day of the Month”

5-26-2010 9-07-08 AM

7) Let’s preview the report. There won’t be any data yet, but the parameters are defaulting as intended

5-26-2010 9-09-08 AM

To see the data simply change the date range to reflect 1/1/1996 to 7/1/1999 then press “View Report”

5-26-2010 9-15-54 AM

1) If you want to see the date only and not the time, just right click on the field and select “Text Box Properties”

5-26-2010 1-45-59 PM

2) In the Text Box Properties dialog select Number

5-26-2010 1-49-19 PM

3) From within the Category section select “Date”

5-26-2010 1-49-43 PM

4) Then select the format under “Type”

5-26-2010 1-50-09 PM

And here is the outcome…

5-26-2010 4-12-53 PM

SQL Server 2008 Licensing

Today I had an in-depth conversation about SQL Server licensing with Bryan Friedly, who is an Account Manager for Microsoft in Texas. Buck Woody, a Sr. SQL Server Technical Specialist for Microsoft (Website | Twitter), simplified it with his MSDN article “Simplified SQL Server Licensing”, and I wanted to extend on it just a bit with what I had learned.

I originally thought that licenses were based on connections (concurrent connections for that matter) and, as it turned out, I was wrong (imagine that). Microsoft is not really concerned with what technologies you are accessing SQL Server with; rather, their concern is focused on who or what device will access data from SQL Server.

So when you think of licensing scenarios, think of the people and then of the devices those people will use to access the data. A person and a device are each considered an accessible end-point, so to speak, which translates into needed CALs depending on the scenario.

It can get sticky with shared workspaces, where multiple people share a common system (laptops, mobile devices, desktops, tablets, etc.). To clarify, I will run through a few examples to help illustrate what I have learned so far, or at the very least my level of understanding of how the licensing works. To start, let me touch on the basics. With SQL Server 2008 there are two license models: Per Processor and Server/CAL, and each has its own pricing segments.

Prices are based on retail, not software assurance.

Edition      Per Processor   Server/CAL   CAL
Datacenter   $54,999.00      N/A          N/A
Enterprise   $27,495.00      $8,592.00    $168.00
Standard     $7,171.00       $898.00      $168.00
Workgroup    N/A             $730.00¹     $148.00
Web          $3,500.00       $1,751.00    N/A

1 includes 5 workgroup CALs

Microsoft Official SQL Server 2008 Pricing

Per Processor:

This is by far the simplest model to implement, but with simplicity comes a pretty hefty price tag, entirely dependent upon the edition you purchase. With the “Per Processor” license model any client can connect from anywhere. This means anything from a single user or device up to an uncountable number of users and devices can access SQL Server and you are still in compliance under this licensing model. It is definitely ideal when your user or device base exceeds the plausibility of a physical count; in short, when you have no idea how many users or devices are accessing SQL Server. So if you have a quad-socket (quad-core) server then you only need to license the four sockets and not the individual cores, which is the opposite of Oracle’s approach. That doesn’t mean you must license all four sockets either: if you plan on running SQL Server on just two processors then you only need to purchase the “Per Processor” license times two.

per-processor-model-2

Server/CAL:

This model can get a bit tricky, but it does not necessarily have to be. The first thing you should know is that there are two types of CALs: Per User and Per Device. There is another term you should become familiar with as well: multiplexing, otherwise known as “pooling”. Pooling is an added layer between the user and SQL Server; a web application is a typical layer that sits between a person or device and the back-end SQL Server database.

multiplexing

In this scenario the application server, which falls under multiplexing, is not required to have a license, so that’s an added savings. You will, however, need to obtain CALs for every user that sends/retrieves data from the underlying SQL Server. So if you have 40 users using your web application then you will need to acquire 40 licenses.

server-cal-model

In the above illustration I have laid out my interpretation of the different CALs. As you can see, the “Per User” CALs can access any and all of the SQL Servers, even at the same time. The “Per Device” CAL is device-specific, so if your device needs to access two servers then two licenses are required. However, “Per Device” does have its advantages, especially in shared workspaces. If sharing workstations is common practice in your organization then this is the more economical approach: with 25 workstations and 100 employees you only need to obtain 25 “Per Device” licenses to satisfy the license agreement. So let’s run through one more scenario.

We are planning to introduce a replicated environment to the organization where there will be 25 subscribers via mobile devices (five are shared between 15 users) and 35 desktop users (five shared between 10 users) that will access the database via a .NET web application. The client purchased a quad-socket (quad-core) server with the intention of loading and operating SQL Server 2008. What edition should be purchased, and what would be the most economical licensing option? SQL Server 2008 Standard Edition supports up to four processors, and based on the hardware configuration the “Per Processor” model would cost upwards of $28,000, so we will use the “Server/CAL” method.

What do we know so far?

Qty   Description
1     SQL Server Standard Edition
20    Laptop users/subscribers
30    Desktop users
10    Shared devices
25    Users of the shared devices
75    actual users
60    actual licensable end-points

It would be ideal to use “Per User” CALs for the 20 laptop and 30 desktop users, then “Per Device” CALs for the five shared laptops and the five shared desktops, which saves us from purchasing 15 CALs (a $2,520.00 savings). Based on retail pricing we are looking at:

Qty   Item Description                   Price
1     SQL Server 2008 Standard Edition   $898.00
50    Per User CALs                      $168.00
10    Per Device CALs                    $168.00
                                  Total: $10,978.00

If we went with just “Per User” CALs then the total of accessible end-point CALs would increase by 15, to include each of the users sharing the laptops and desktops.

Qty   Item Description                   Price
1     SQL Server 2008 Standard Edition   $898.00
75    Per User CALs                      $168.00
                                  Total: $13,498.00
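For a sanity check, the two totals can be reproduced with throwaway arithmetic queries using the retail prices quoted above:

```sql
-- Mixed model: server license plus 50 per-user and 10 per-device CALs
SELECT 898.00 + (50 * 168.00) + (10 * 168.00) 'Mixed CALs'    -- $10,978.00
-- Per-user only: every one of the 75 users needs a CAL
SELECT 898.00 + (75 * 168.00)                 'Per User only' -- $13,498.00
-- The 15 CALs avoided by licensing the shared devices instead
SELECT 15 * 168.00                            'Savings'       -- $2,520.00
```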

Replication Process Order

Recently, during a deployment, I ran through a replication setup rehearsal in order to practice my processes. It is always good practice to run through an upgrade rehearsal and document your steps, experiences and results. To give you a brief background: in all of our deployments we use merge replication, since we send and receive data between our server and mobile subscribers.

As I ran through the article selection process I selected the appropriate tables, indicated their sync direction (download only or bi-directional), followed by the stored procedures and views, and ended with the functions as usual. We didn’t have any indexed views; otherwise they would have gone before the functions. Little did I realize that one view happened to reference two other views, meaning that in order for the initialization to complete successfully the two referenced views would need to be created first. The rule “you cannot put the cart before the horse” definitely applies here.

So to clarify, here is a simple scenario: you have a table article and a view which directly references that table… simple enough. The view cannot be processed before the table, because the dependency requires the table to be in place first. Otherwise the initial snapshot at the subscriber will fail. Hopefully that makes sense.

In the case of the two views, they needed to be processed before the referencing view could be applied. The quick and dirty workaround was to remove the article and reapply it, then generate a new snapshot. Honestly, this is not an acceptable practice even with a small subscriber base of three, and it is especially unacceptable when dealing with subscriber counts that exceed 200. The problem is that any new subscribers will fail at the initial snapshot delivery at this very article because of its dependencies, even though the process works fine when you re-initialize an existing subscriber.

So a method to rectify this issue is to identify the article processing order by running sp_helpmergearticle as follows:

USE [AdventureWorks]
EXEC sp_helpmergearticle

It returned a result set of all the articles including the name (article name), upload_option (sync direction), processing_order and so on. The information proved very useful, as it showed that the processing_order values were all set to zero. In the event that you want information about a specific article, the following statement will do just that:

USE [AdventureWorks]
EXEC sp_helpmergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'AddressType'

Since the default processing_order values were all zero, the problem article would always revert back to its original position. Luckily there is an option to change the article by using the sp_changemergearticle stored procedure:

USE [AdventureWorks]
EXEC sp_changemergearticle @publication = 'AdventureWorks_publication'
     ,@article = 'AddressType'
     ,@property = 'processing_order'
     ,@value = 10
     ,@force_invalidate_snapshot = 1
     ,@force_reinit_subscription = 0

Just to make sure it worked I re-ran the first query, and sure enough the article in question dropped to the very last position in the result set now that its processing_order value was set to 10. The real test came when I staged a new subscriber by dropping the local database and clearing out the accounts: the delivery of the initial snapshot processed successfully without any further errors. Now I can rest better knowing that the support staff can add subscribers without running into this issue.
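To double-check the ordering across the whole publication at a glance, sp_helpmergearticle can simply be re-run; alternatively (assuming the sysmergearticles system table in the publication database exposes the same column, which it should on this version) the articles can be sorted directly:

```sql
USE [AdventureWorks]
-- Sketch: list merge articles in the order the snapshot will process them,
-- assuming sysmergearticles carries the processing_order column
SELECT name, upload_option, processing_order
FROM dbo.sysmergearticles
ORDER BY processing_order, name
```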

Identifying your Recovery Model

Identifying your recovery model is relatively simple and can be done in several ways. In this post I will illustrate the process of identifying your recovery model via the UI and with TSQL. Understanding the differences between the three models is important, so here are some informative MSDN links that explain each model in detail.

Recovery Models: Simple, Full, Bulk-Logged.

Using the UI

Step 1:  Right click on the target database and select “Properties”

01.RightClickSelectProperties

Step 2: Select “Options”

02.SelectOptions

Step 3: Select the appropriate recovery model

03.ChooseRecoveryModel

As you can see it is very easy, but in order to determine the model for each database you would have to repeat these steps for every database. Not the most efficient way to spend your day. Now let’s get them all in one shot.

With TSQL

Step 1: From within the SQL Server Management Studio start a new query

04.NewQueryWindow

Step 2: Run the following statement

SELECT name
       ,recovery_model_desc
FROM sys.databases
ORDER by name

Here are the results

06.Results

Change Recovery Model via TSQL

Step 1: Run the following statement

ALTER DATABASE AdventureWorks
SET RECOVERY FULL -- FULL, SIMPLE, BULK_LOGGED

Here are the results after re-running the select statement

08.AlterResults