Tuesday, November 11, 2014

Importing SQLPS module results in Warnings

Update (June 2015): After originally posting this back in November, and debugging the issue earlier in the year, I have finally got around to recording my findings and updating this blog post. Hopefully it also provides some insight into methods for debugging PowerShell when faced with provider issues.

I've been meaning to blog about this for a while now. If you do any work with PowerShell, SQL Server and Azure, then chances are you have noticed that when you import the SQLPS module it reports warnings like the following:
WARNING: Could not obtain SQL Server Service information. An attempt to connect to WMI on
'Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml' failed with the following error: The RPC server is
unavailable. (Exception from HRESULT: 0x800706BA)
WARNING: Could not obtain SQL Server Service information. An attempt to connect to WMI on
'Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml' failed with the following error: The RPC server is
unavailable. (Exception from HRESULT: 0x800706BA)
...
WARNING: Could not obtain SQL Server Service information. An attempt to connect to WMI on
'Microsoft.WindowsAzure.Commands.SqlDatabase.dll' failed with the following error: The RPC server is unavailable.
(Exception from HRESULT: 0x800706BA)
WARNING: The names of some imported commands from the module 'sqlps' include unapproved verbs that might make them less
 discoverable. To find the commands with unapproved verbs, run the Import-Module command again with the Verbose
parameter. For a list of approved verbs, type Get-Verb.
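
If you just need SQLPS loaded without the noise, a common workaround (a sketch only; it hides the warnings rather than fixing the cause explored below) is to suppress the two warning sources and restore your location afterwards:

Push-Location;
Import-Module SQLPS -DisableNameChecking 3> $null;   # -DisableNameChecking hides the unapproved-verb warning, 3> redirects the warning stream
Pop-Location;   # importing SQLPS changes the current location to SQLSERVER:\, so put it back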

The following modules are affected by the issue described in this blog:

Module       | Version | Notes
SQLPS        | 1.0     | SQL 2014 release
SQLASCMDLETS | 1.0     | SQL 2014 release
Azure        | 0.8.8   | Azure SDK June 2014 release

First, a warning: do not modify any of the files mentioned in this blog. I will be exploring the code that runs when you import the module, but these files are signed and any modification may break the module's functionality.

People generally notice this error when they run:

Import-Module SQLPS

What happens when you run this command is that PowerShell loads the module manifest C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules\SQLPS\SQLPS.PSD1


This references a number of other code bases.

Firstly, it loads a number of nested modules, most significantly "Microsoft.SqlServer.Management.PSSnapins.dll" and "Microsoft.SqlServer.Management.PSProvider.dll", which load the SQLPS provider and set the current location to the SQLSERVER:\ drive (hence the PS SQLSERVER:\> prompt).
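
If you want to see exactly what the manifest pulls in without actually importing the module, Test-ModuleManifest will enumerate it for you (a quick sketch, using the path above):

$manifest = Test-ModuleManifest "C:\Program Files (x86)\Microsoft SQL Server\120\Tools\PowerShell\Modules\SQLPS\SQLPS.PSD1";
$manifest.NestedModules | Select-Object Name, ModuleType;   # the nested binary modules mentioned above
$manifest.RequiredAssemblies;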

It also loads the script file SqlPsPostScript.PS1 from the same path. It is the code in this script that generates the warnings (although, as we will see, it is not the root cause).

This script has a step designed to import the SQLASCmdlets module, and it is this step which throws the warnings. However, the problem isn't really the loading of the SQLASCmdlets module itself but the Get-Module command which is executed first:
#Load the SQLAS Module
$m = Get-Module -ListAvailable | where {$_.Name -eq "SQLASCmdlets"}
if($m -ne $null) { Import-Module $m -Global }

After a bit of troubleshooting I found that the problem even happens if you just run Get-Module -ListAvailable from within the SQLPS provider.



However, if you run Get-Module -ListAvailable from another provider (e.g. the FileSystem provider) the warnings do not appear.
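
A minimal repro, assuming the SQLPS module (and therefore the SQLSERVER: drive) is already loaded in the session:

Set-Location C:\;
Get-Module -ListAvailable | Out-Null;   # FileSystem provider - no warnings

Set-Location SQLSERVER:\;
Get-Module -ListAvailable | Out-Null;   # SQLPS provider - the WMI warnings appear while the module list is built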



So the culprit lies in the way the SQLPS provider (via its PSDrive) implements the standard provider cmdlets. You can see this even with the Import-Module cmdlet when it tries to work with the Azure module while the current path is set to SQLSERVER:\.



Now, typically this is where most people would get stuck, as the SqlPs and Azure modules are binary modules (i.e. they load .dlls), so the general public cannot simply dig into the code.

However, here we can leverage PowerShell debugging to go deep into the code stack. So the actions that I took next were:

1) Use the -Debug switch parameter on the Import-Module
Import-Module -Name SqlPs -Debug

2) Use Trace-Command
Trace-Command -Expression {Import-Module SQLPS} -FilePath C:\ImportModuleTraceCommand.txt -Name * -Option All

Using the output captured by these commands we can trace the problem further down the stack.
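
If you would rather have the debug messages in a file for side-by-side comparison with the Trace-Command log, the debug stream (stream 5) can be redirected as well; a sketch (the output path is just an example):

$DebugPreference = 'Continue';   # emit debug messages without the Inquire prompt
Import-Module SQLPS 5> C:\ImportModuleDebug.txt;
$DebugPreference = 'SilentlyContinue';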

Firstly by looking at the Debug output we can see the following function calls just before the WMI errors.

---------------------------------------------------------------------------
DEBUG: ItemExists: SQLSERVER:\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml
DEBUG: ItemExists: SQLSERVER:\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
WARNING: Could not obtain SQL Server Service information. An attempt to connect to WMI on 'Microsoft.WindowsAzure.Commands.S
qlDatabase.Types.ps1xml' failed with the following error: The RPC server is unavailable. (Exception from HRESULT: 0x800706BA
)
---------------------------------------------------------------------------

This tells us that the module loading is calling the ItemExists function to test a path for a file it is told to load (and we already know that the path is coming from the definition of the Azure module).

Next we can look at the Trace-Command data.

If you compare the two extracts below, you can clearly see that the first resolves the path via the FileSystem provider while the second resolves it via the SQLSERVER provider. Looking more closely, the step logged as "Path is DRIVE-QUALIFIED" returns True in the first case and False in the second. Because it returns False, the SQL provider then resolves the relative path against the current drive, which in this case is SQLSERVER:, and eventually fails with "ERROR: Item does not exist".


Successful loading of types file:

PathResolution Information: 0 :                                                     Resolving MSH path "C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml" to MSH path
...
PathResolution Information: 0 :                                                         Path is DRIVE-QUALIFIED
LocationGlobber Information: 0 :  WriteLine                                                           result = True
LocationGlobber Information: 0 :  WriteLine                                                           Drive Name: C
SessionState Information: 0 :  WriteLine                                                           Drive found in scope 1
LocationGlobber Information: 0 :  WriteLine                                                           path = Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml
CmdletProviderClasses Information: 0 :  WriteLine                                                           result=Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml
CmdletProviderClasses Information: 0 :  WriteLine                                                           result=C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml
SessionState Information: 0 :  WriteLine                                                           result = C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml
CmdletProviderClasses Information: 0 :  WriteLine                                                           basePath = C:\
...
PathResolution Information: 0 :                                                         DRIVE-RELATIVE path: Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml
PathResolution Information: 0 :                                                         Drive: C
PathResolution Information: 0 :                                                         Provider: Microsoft.PowerShell.Core\FileSystem
...
PathResolution Information: 0 :                                                         RESOLVED PATH: C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Services\Microsoft.WindowsAzure.Commands.Websites.Types.ps1xml




Unsuccessful loading of types file:
PathResolution Information: 0 :                                                 Resolving MSH path ".\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml" to PROVIDER-INTERNAL path
...
PathResolution Information: 0 :                                                     Path is DRIVE-QUALIFIED
LocationGlobber Information: 0 :  WriteLine                                                       result = False
LocationGlobber Information: 0 :  WriteLine                                                       path = Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
CmdletProviderClasses Information: 0 :  WriteLine                                                       result=Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
CmdletProviderClasses Information: 0 :  WriteLine                                                       result=SQLSERVER:\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
SessionState Information: 0 :  WriteLine                                                       result = SQLSERVER:\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
CmdletProviderClasses Information: 0 :  WriteLine                                                       basePath = SQLSERVER:\
...
PathResolution Information: 0 :                                                     DRIVE-RELATIVE path: Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
PathResolution Information: 0 :                                                     Drive: SQLSERVER
PathResolution Information: 0 :                                                     Provider: Microsoft.SqlServer.Management.PSProvider\SqlServer
...
PathResolution Information: 0 :                                                     RESOLVED PATH: SQLSERVER:\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml
...
PathResolution Information: 0 :                                                             ERROR: Item does not exist: Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.Types.ps1xml




At this point I knew there was something different happening in the path resolution process of each provider.

Taking in everything I had investigated, I could conclude that the SQL provider implements the ItemExists function differently to the FileSystem provider. In particular, the SQL provider takes the given file path and tries to match it to the structure of the SQLSERVER: drive. Because the next part of the path starts with "\Sql\", the provider believes it is being asked for a path under the Sql folder of the drive, so it treats the following segment (in our case the file name) as a server name and attempts to connect to it via WMI to retrieve the instance details... hence the WMI warnings.
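
If this analysis is right, the warning should be reproducible in isolation (with SQLPS already imported) by testing a made-up file name under the Sql folder of the drive. The name below is purely hypothetical:

Test-Path "SQLSERVER:\Sql\NotARealServer.ps1xml";   # expect a warning of the same form as those shown at the top of this post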

To the best of my knowledge this issue has been around since an update to the Azure module in early 2014, when the folder structure it uses was changed. At the time of writing (both the original post and its subsequent updates) the problem still exists.

If you want to help get this resolved please vote on the MS Connect item: https://connect.microsoft.com/SQLServer/feedback/details/1420992/import-module-sqlps-may-take-longer-to-load-but-always-returns-warnings-when-the-azure-powershell-module-is-also-installed#



 Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.


Troubleshooting blocking locks in SQL Azure

I was faced with an interesting situation yesterday where a customer had created a blocking chain on their SQL Azure database. The scenario they described was:

A query took out and held a lock on an object, but the client connection closed and orphaned the session on the SQL Azure database. This resulted in other queries which tried to access the object being blocked and timing out (even a SELECT *).

Given we do not have any graphical tools in SSMS to see current activity when connected to a SQL Azure database, this presents a challenge for those who are used to going to Activity Monitor or the standard reports to view this sort of information. Thankfully most of the DMVs and DMFs are supported on SQL Azure.

To troubleshoot the above scenario we used the following queries to identify the culprit and eventually terminate it with the only option available: KILL [spid]. To demonstrate this I have recreated the scenario on my Azure database with a bit of PowerShell (which I am not sharing, as it is bad practice). The good news is that these queries will also work in your on-premises environments.

TSQL to identify the Blockers and Victims

WITH Blockers AS
(
 SELECT DISTINCT blocking_session_id AS session_id
 FROM sys.dm_exec_requests
 WHERE blocking_session_id > 0
)
SELECT 'Blocker' as type_desc
 , sys.dm_exec_sessions.session_id
 , sys.dm_exec_requests.start_time
 , sys.dm_exec_requests.status
 , sys.dm_exec_requests.command
 , sys.dm_exec_requests.wait_type
 , sys.dm_exec_requests.wait_time
 , sys.dm_exec_requests.blocking_session_id
 , '' AS stmt_text
FROM sys.dm_exec_sessions
LEFT JOIN sys.dm_exec_requests ON sys.dm_exec_requests.session_id = sys.dm_exec_sessions.session_id
INNER JOIN Blockers ON Blockers.session_id = sys.dm_exec_sessions.session_id
UNION
SELECT 'Victim' as type_desc
 , sys.dm_exec_sessions.session_id
 , sys.dm_exec_requests.start_time
 , sys.dm_exec_requests.status
 , sys.dm_exec_requests.command
 , sys.dm_exec_requests.wait_type
 , sys.dm_exec_requests.wait_time
 , sys.dm_exec_requests.blocking_session_id
 , ST.text AS stmt_text
FROM sys.dm_exec_sessions
INNER JOIN sys.dm_exec_requests ON sys.dm_exec_requests.session_id = sys.dm_exec_sessions.session_id
CROSS APPLY SYS.DM_EXEC_SQL_TEXT(sys.dm_exec_requests.sql_handle) AS ST
WHERE blocking_session_id > 0
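
If you would rather run this from PowerShell than SSMS, Invoke-Sqlcmd works against SQL Azure too. A sketch only: the server, database and credentials are placeholders, and $BlockersAndVictimsQuery is assumed to be a variable holding the query text above.

$params = @{
    ServerInstance = "yourserver.database.windows.net";
    Database       = "YourDatabase";
    Username       = "YourSqlLogin";
    Password       = "YourPassword";
};
Invoke-Sqlcmd @params -Query $BlockersAndVictimsQuery | Format-Table -AutoSize;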

The output of this query then looks like the following, where you can clearly see the blocker and its victims (in fact this example has a second layer of blocking).





TSQL to view the established locks within the current database

SELECT
 (CASE sys.dm_tran_locks.request_session_id
  WHEN -2 THEN 'ORPHANED DISTRIBUTED TRANSACTION'
  WHEN -3 THEN 'DEFERRED RECOVERY TRANSACTION'
  ELSE sys.dm_tran_locks.request_session_id
 END) AS session_id
 , DB_NAME(sys.dm_tran_locks.resource_database_id) AS database_name
 , sys.objects.name AS locked_obj_name
 , sys.dm_tran_locks.resource_type AS locked_resource
 , sys.dm_tran_locks.request_mode AS lock_type
 , ST.text AS stmt_text
 , sys.dm_exec_sessions.login_name AS login_name
 , sys.dm_exec_sessions.host_name AS host_name
 , sys.dm_tran_locks.request_status as request_status
FROM sys.dm_tran_locks
JOIN sys.objects ON sys.objects.object_id = sys.dm_tran_locks.resource_associated_entity_id
JOIN sys.dm_exec_sessions ON sys.dm_exec_sessions.session_id = sys.dm_tran_locks.request_session_id
JOIN sys.dm_exec_connections ON sys.dm_exec_connections.session_id = sys.dm_exec_sessions.session_id
CROSS APPLY SYS.DM_EXEC_SQL_TEXT(sys.dm_exec_connections.most_recent_sql_handle) AS st
WHERE sys.dm_tran_locks.resource_database_id = DB_ID()
ORDER BY sys.dm_tran_locks.request_session_id

The output of this query shows the various locks established by each session (one session to many locks). From here we can see which locks have taken priority and which could potentially block more queries in the future.
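
Once the blocking session_id is confirmed, the KILL can be issued the same way (reusing the $params splat from the sketch above; the session id 72 is a placeholder):

Invoke-Sqlcmd @params -Query "KILL 72;";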



So while you may not have a graphical view of this information, we can definitely use our trusted DMVs and DMFs to gain access to the relevant data. Who knows, with the current rate of change it can surely only be a matter of time until Activity Monitor and the standard reports are available in SSMS for SQL Azure.

Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Tuesday, September 9, 2014

A look at how PowerShell handles Case Sensitivity

I was asked today by an attendee at one of my PowerShell workshops why sorting data by case did not behave as expected, so I thought I would dig a bit deeper into it.

As we know, PowerShell is case insensitive by default... well, for most things. The cmdlets Select-Object (with -Unique) and Get-Unique actually behave case sensitively... so it's almost everything by default.

When it comes to sorting data with Sort-Object and its -CaseSensitive switch parameter, you may also be surprised that it does not order strings the way you might think.

Let's start with a fairly obvious array of strings, changing the case of just one character at a time:
$Strings = "aabbcc","aabbcC","aabbCc","aaBbcc","aAbbcc","Aabbcc","aabBcc";
$Strings | Sort-Object -CaseSensitive;

This sorts as you would expect:

aabbcc
aabbcC
aabbCc
aabBcc
aaBbcc
aAbbcc
Aabbcc

However, if we take a more complicated array of strings, where we not only change the case of individual characters but also use different characters and different character orders within the strings, like this:
$Strings = "abc", "acc", "adc", "aec", "afc", "agc", "ahc","aBc", "aCc", `
"aDc", "aEc", "aFc", "aGc", "aHc", "Abc", "Acc", "Adc", "Aec", "Afc", "Agc", "Ahc";
$Strings | Sort-Object -CaseSensitive;

Then the behavior changes to something you may not expect, and results in:

abc
aBc
Abc
acc
aCc
Acc
adc
aDc
Adc
aec
aEc
Aec
afc
aFc
Afc
agc
aGc
Agc
ahc
aHc
Ahc

Notice that not all the strings starting with a lowercase 'a' are grouped together?

So what is happening here? Well, it seems that all strings are first sorted regardless of case (i.e. all the strings spelling 'aec' appear together). PowerShell then evaluates each character within the string individually. So in the case of the three strings "aec", "aEc" and "Aec", the all-lowercase version wins outright, then the string whose first differing character is lowercase, and finally the string with the uppercase 'A'. But why do all of these beat 'afc'? Because even though they all start with 'a', 'ae' sorts before 'af', regardless of case.

Let's look at a slightly more complicated example to highlight this further:
$Strings = "abc", "acc", "adc", "aec", "afc", "agc", "ahc","aBc", "aCc", `
"aDc", "aEc", "aFc", "aGc", "aHc", "Abc", "Acc", "Adc", "Aec", "Afc", "Agc", "Ahc", `
"aab","abb","Aab","Abb","ABb", "AbB";
$Strings | Sort-Object -CaseSensitive;

Which returns:

aab
Aab
abb
Abb
AbB
ABb
abc
aBc
Abc
acc
aCc
Acc
adc
aDc
Adc
aec
aEc
Aec
afc
aFc
Afc
agc
aGc
Agc
ahc
aHc
Ahc

Play around with this, try sorting file names in C:\Windows\System32 with both lower and upper cases, and you will see this behavior.

So why is this important? Well, if you are just using cmdlets then case-sensitive sorting is about the only place you will really notice this behavior. However, if you are using the case-sensitive comparison operators, particularly -CLT, -CLE, -CGT and -CGE, then you definitely need to care about this as the outcome may not be what you expect. For example:

#You would expect the string 'abc' to be lower than 'ABb' due to the lowercase first character 'a', but in fact this returns False.
"abc" -CLT "ABb"

Which returns False

In fact there is no difference between the above and the following, which also (as expected) returns False, but it helps to highlight how PowerShell is applying its logic.
"abc" -LT "abb"


So in summary, case-sensitive ordering only comes into play when strings contain exactly the same characters, because PowerShell (or more accurately .Net) first orders by the characters regardless of case, and then sorts by case within each group of otherwise identical strings.
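
A quick way to see the two comparison rules side by side is the .Net String.Compare overloads (a sketch; the results in the comments are for an English culture):

[string]::Compare("aec", "Afc", [System.StringComparison]::CurrentCulture);   # negative: 'ae' sorts before 'af', case is ignored at this stage
[string]::Compare("aec", "Aec", [System.StringComparison]::CurrentCulture);   # negative: same letters, so lowercase wins
[string]::Compare("aec", "Aec", [System.StringComparison]::Ordinal);          # positive: pure character codes, 'A' (65) beats 'a' (97)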


If you want to know what CmdLets support the CaseSensitive parameter then use these statements

#find the other cmdlets with CaseSensitive param
Get-Module -ListAvailable; Get-Command -ParameterName CaseSensitive;

# If you don't want to list the available modules first (to read them into the session) then use this
Get-Command | %{try {if ($_.Parameters.Count -gt 0 -and $_.Parameters.Keys.Contains("CaseSensitive") -eq $true){"Found in: $($_.Name)"}} catch {} finally {}}



Just to extend this investigation further I fired up Visual Studio, created a basic Windows Form with a List Box and the following code.

//create the array
String[] sStrings = new string[] {"abc", "acc", "adc", "aec", "afc", "agc", "ahc","aBc", "aCc", "aDc", "aEc", "aFc", "aGc", "aHc", "Abc", "Acc", "Adc", "Aec", "Afc", "Agc", "Ahc", "aab","abb","Aab","Abb","ABb", "AbB"};

//Sort the values
Array.Sort(sStrings);

//add the strings to the list box 
listBox1.Items.Clear();
listBox1.Items.AddRange(sStrings);

This outputs the values in the same order as above, as in:

....
Aab
abb
Abb
AbB
ABb
abc
aBc
Abc
acc
aCc
Acc
....

UPDATE: According to one of my .Net programmer mates (thanks Steve), this behavior is known as lexicographical ordering.
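
If you ever want the other behavior (pure character-code ordering), you can hand .Net an ordinal comparer explicitly; a quick sketch:

$sStrings = "abc", "aBc", "Abc", "aab", "Aab";
[System.Array]::Sort($sStrings, [System.StringComparer]::Ordinal);
$sStrings;   # Aab, Abc, aBc, aab, abc - every string starting with an uppercase letter now sorts ahead of the lowercase ones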



Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Tuesday, July 22, 2014

The evolution of SQLPS CmdLets

The following table contains a list of all of the CmdLets and Functions released with SQLPS over the last few versions and highlights the evolution of the PowerShell support in SQL Server.

SQL 2008 R2 | SQL 2012 | SQL 2014
SQLSERVER: | SQLSERVER: | SQLSERVER:
Decode-SqlName | Add-SqlAvailabilityDatabase | Add-SqlAvailabilityDatabase
Encode-SqlName | Add-SqlAvailabilityGroupListenerStaticIp | Add-SqlAvailabilityGroupListenerStaticIp
Invoke-Sqlcmd | Backup-SqlDatabase | Add-SqlFirewallRule
- | Convert-UrnToPath | Backup-SqlDatabase
- | Decode-SqlName | Convert-UrnToPath
- | Disable-SqlAlwaysOn | Decode-SqlName
- | Enable-SqlAlwaysOn | Disable-SqlAlwaysOn
- | Encode-SqlName | Enable-SqlAlwaysOn
- | Get-SqlCredential | Encode-SqlName
- | Invoke-PolicyEvaluation | Get-SqlCredential
- | Invoke-Sqlcmd | Get-SqlDatabase
- | Join-SqlAvailabilityGroup | Get-SqlInstance
- | New-SqlAvailabilityGroup | Get-SqlSmartAdmin
- | New-SqlAvailabilityGroupListener | Invoke-PolicyEvaluation
- | New-SqlAvailabilityReplica | Invoke-Sqlcmd
- | New-SqlCredential | Join-SqlAvailabilityGroup
- | New-SqlHADREndpoint | New-SqlAvailabilityGroup
- | Remove-SqlAvailabilityDatabase | New-SqlAvailabilityGroupListener
- | Remove-SqlAvailabilityGroup | New-SqlAvailabilityReplica
- | Remove-SqlAvailabilityReplica | New-SqlBackupEncryptionOption
- | Remove-SqlCredential | New-SqlCredential
- | Restore-SqlDatabase | New-SqlHADREndpoint
- | Resume-SqlAvailabilityDatabase | Remove-SqlAvailabilityDatabase
- | Set-SqlAvailabilityGroup | Remove-SqlAvailabilityGroup
- | Set-SqlAvailabilityGroupListener | Remove-SqlAvailabilityReplica
- | Set-SqlAvailabilityReplica | Remove-SqlCredential
- | Set-SqlCredential | Remove-SqlFirewallRule
- | Set-SqlHADREndpoint | Restore-SqlDatabase
- | Suspend-SqlAvailabilityDatabase | Resume-SqlAvailabilityDatabase
- | Switch-SqlAvailabilityGroup | Set-SqlAuthenticationMode
- | Test-SqlAvailabilityGroup | Set-SqlAvailabilityGroup
- | Test-SqlAvailabilityReplica | Set-SqlAvailabilityGroupListener
- | Test-SqlDatabaseReplicaState | Set-SqlAvailabilityReplica
- | - | Set-SqlCredential
- | - | Set-SqlHADREndpoint
- | - | Set-SqlNetworkConfiguration
- | - | Set-SqlSmartAdmin
- | - | Start-SqlInstance
- | - | Stop-SqlInstance
- | - | Suspend-SqlAvailabilityDatabase
- | - | Switch-SqlAvailabilityGroup
- | - | Test-SqlAvailabilityGroup
- | - | Test-SqlAvailabilityReplica
- | - | Test-SqlDatabaseReplicaState
- | - | Test-SqlSmartAdmin


NOTE: This list was compiled from executing "Get-Command -Module SQLPS" (or similar) on the relevant versions.

In addition to this, the SQLPS.exe utility was introduced in SQL 2008 and is still available; however, it has been marked for deprecation in future releases. I have always recommended that PowerShell folk use the SQLPS module, which was introduced in SQL 2012 and is still maintained today. The main reason for this recommendation is that the SQLPS utility loads a PowerShell v2 shell, and because I often use cmdlets other than just the SQL ones in my scripts, I find it more efficient to develop for the latest release.



Monday, July 21, 2014

TSQL to suggest optimal MaxServerMemory value

A while ago a colleague of mine wrote a TSQL script to help suggest a MaxServerMemory value, and more recently we worked together to rewrite that script for SQL 2012. While I haven't "checked" it against SQL 2014 the principles are the same, and the reason I am blogging it today is that I am always looking for the script or a URL to direct people to, so this post makes that easier :)

Of course there is no black and white recommendation for MaxServerMemory as it is workload dependent. However, this script can assist with understanding what that workload requirement is within a SQL Server instance. Please ensure you carefully consider all other memory needs of the server, such as additional SQL instances and services like AV, backups, monitoring, etc.


SET NOCOUNT ON;

DECLARE 
--@pg_size INT, 
@Instancename varchar(50),
--@RecMem int,
@MaxMem int,
@MaxRamServer int,
@sql varchar(max),
@SQLVersion tinyint


SELECT @SQLVersion = @@MicrosoftVersion / 0x01000000  -- Get major version

-- SELECT physical_memory_kb as ServerRAM_KB from sys.dm_os_sys_info
-- SELECT @pg_size = low from master..spt_values where number = 1 and type = 'E'
-- SELECT @Instancename = LEFT([object_name], (CHARINDEX(':',[object_name]))) FROM sys.dm_os_performance_counters WHERE counter_name = 'Buffer cache hit ratio'
PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
PRINT 'Optimal MaxServerMemory Setting for SQL Server instance ' + @@SERVERNAME  + ' (' + CAST(SERVERPROPERTY('productversion') AS VARCHAR) + ' - ' +  SUBSTRING(@@VERSION, CHARINDEX('X',@@VERSION),4)  + ' - ' + CAST(SERVERPROPERTY('edition') AS VARCHAR) + ')'
PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'

IF @SQLVersion = 11
BEGIN
 PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
 PRINT 'Total Memory on the Server (MB)' 
 EXEC sp_executesql N'set @_MaxRamServer= (select physical_memory_kb/1024 from sys.dm_os_sys_info)', N'@_MaxRamServer int OUTPUT', @_MaxRamServer = @MaxRamServer OUTPUT
 Print @MaxRamServer
 PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
END
ELSE
IF @SQLVersion in (10,9)
BEGIN
 PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
 PRINT 'Total Memory on the Server (MB)' 
 EXEC sp_executesql N'set @_MaxRamServer= (select physical_memory_in_bytes/1024/1024 from sys.dm_os_sys_info)', N'@_MaxRamServer int OUTPUT', @_MaxRamServer = @MaxRamServer OUTPUT
 Print @MaxRamServer
 PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
END
ELSE 
BEGIN
 PRINT 'Script only supports SQL Server 2005 or later.'
 RETURN
END

--SELECT @RecMem=physical_memory_kb/1024 from sys.dm_os_sys_info

SET @MaxMem = CASE 
    WHEN @MaxRamServer <= 1024*2  THEN @MaxRamServer - 512  /*When the RAM is less than or equal to 2GB*/
    WHEN @MaxRamServer <= 1024*4  THEN @MaxRamServer - 1024 /*When the RAM is less than or equal to 4GB*/
    WHEN @MaxRamServer <= 1024*16 THEN @MaxRamServer - 1024 - Ceiling((@MaxRamServer-4096) / (4.0*1024))*1024 /*When the RAM is less than or equal to 16GB*/

 -- My machine's memory calculation
 -- RAM = 16GB
 -- Case 3 as above: 16384 RAM -> MaxMem = 16384 - 1024 - [(16384-4096)/4096]*1024
 -- MaxMem = 12106

    WHEN @MaxRamServer > 1024*16 THEN @MaxRamServer - 4096 - Ceiling((@MaxRamServer-1024*16) / (8.0*1024))*1024 /*When the RAM is greater than 16GB*/
     END
 SET @sql='
EXEC sp_configure ''Show Advanced Options'',1;
RECONFIGURE WITH OVERRIDE;
EXEC sp_configure ''max server memory'','+CONVERT(VARCHAR(6), @maxMem)+';
RECONFIGURE WITH OVERRIDE;'

PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
PRINT 'Optimal MaxServerMemory Setting for this instance of SQL' 
Print (@sql) 
PRINT '------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------'
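
For a quick sanity check of the tiering outside of SQL, the same CASE logic can be expressed in a few lines of PowerShell (a sketch only, mirroring the thresholds above; the function name is mine):

function Get-SuggestedMaxServerMemoryMB
{
    param([int]$TotalRamMB)
    if     ($TotalRamMB -le 2048)  { $TotalRamMB - 512 }
    elseif ($TotalRamMB -le 4096)  { $TotalRamMB - 1024 }
    elseif ($TotalRamMB -le 16384) { $TotalRamMB - 1024 - [math]::Ceiling(($TotalRamMB - 4096) / 4096) * 1024 }
    else                           { $TotalRamMB - 4096 - [math]::Ceiling(($TotalRamMB - 16384) / 8192) * 1024 }
}
Get-SuggestedMaxServerMemoryMB -TotalRamMB 8192;    # 6144
Get-SuggestedMaxServerMemoryMB -TotalRamMB 32768;   # 26624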






Ping back to Tara S. Jana's site http://sqlmasters.jimdo.com/scripts-tsql


Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Tuesday, July 15, 2014

Demo preparation - Removing unwanted Schemas from AdventureWorks

Those of us whose roles require us to regularly conduct demonstrations and presentations of SQL Server will be familiar with the AdventureWorks databases. I must admit this has been my go-to demo database for a while now. However, sometimes the whole schema is just too much and all I really want is a subset of it for a demonstration.

The problem is that, because of the number of constraints, foreign keys and views in the AdventureWorks database, this is no simple task (as it shouldn't be). I was faced with this issue today and so created the following statements to help generate the drop statements, based on the schemas that the objects belong to. For my needs I only wanted to keep the HumanResources and Person schemas.

Important: back up first if you care about your data, and... DO NOT DO THIS IN PRODUCTION.


--Constraints
SELECT sys.schemas.name as 'Schema'
 , sys.tables.name as 'Table'
 , sys.check_constraints.name as 'Constraint'
 , 'ALTER TABLE [' + sys.schemas.name + '].[' + sys.tables.name + '] DROP CONSTRAINT [' + sys.check_constraints.name + ']' AS 'Drop Stmt'
FROM sys.check_constraints
INNER JOIN sys.tables ON sys.tables.object_id = sys.check_constraints.parent_object_id
INNER JOIN sys.schemas ON sys.schemas.schema_id = sys.tables.schema_id
 AND sys.schemas.name <> 'HumanResources'
 AND sys.schemas.name <> 'Person'

 
--Foreign Keys
SELECT sys.schemas.name as 'Schema'
 , sys.tables.name as 'Table'
 , sys.foreign_keys.name as 'ForeignKey'
 , 'ALTER TABLE [' + sys.schemas.name + '].[' + sys.tables.name + '] DROP CONSTRAINT [' + sys.foreign_keys.name + ']' AS 'Drop Stmt'
FROM sys.foreign_keys
INNER JOIN sys.tables ON sys.tables.object_id = sys.foreign_keys.parent_object_id
INNER JOIN sys.schemas ON sys.schemas.schema_id = sys.tables.schema_id
 AND sys.schemas.name <> 'HumanResources'
 AND sys.schemas.name <> 'Person'


--Views
SELECT sys.schemas.name as 'Schema'
 , sys.views.name as 'View'
 , 'DROP VIEW [' + sys.schemas.name + '].[' + sys.views.name + ']' AS 'Drop Stmt'
FROM sys.views
INNER JOIN sys.schemas ON sys.schemas.schema_id = sys.views.schema_id
 AND sys.schemas.name <> 'HumanResources'
 AND sys.schemas.name <> 'Person'


--Tables
SELECT sys.schemas.name as 'Schema'
 , sys.tables.name as 'Table'
 , 'DROP TABLE ' + sys.schemas.name + '.' + sys.tables.name  AS 'Drop Stmt'
FROM sys.tables 
INNER JOIN sys.schemas ON sys.schemas.schema_id = sys.tables.schema_id
 AND sys.schemas.name <> 'HumanResources'
 AND sys.schemas.name <> 'Person'
ORDER BY sys.schemas.name, sys.tables.name 
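
If you want to go one step further and execute the generated statements rather than copy/paste them, Invoke-Sqlcmd can do the work. A sketch only: the instance and database names are placeholders, $ConstraintQuery is assumed to hold the first SELECT above, and remember to run the four queries in the order shown (constraints, foreign keys, views, then tables):

$conn = @{ ServerInstance = "localhost\SQL2014"; Database = "AdventureWorks2014" };
Invoke-Sqlcmd @conn -Query $ConstraintQuery |
    ForEach-Object { Invoke-Sqlcmd @conn -Query $_.'Drop Stmt' };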




Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Monday, July 7, 2014

Ready, Set, Go.....Format, Encrypt, and Prepare a Removable USB Drive using BitLocker and PowerShell

In my line of work protecting customer data is extremely important, and with that in mind, unfortunately sometimes there is just no way to retrieve data for analysis without using a USB drive. This places some important requirements on any removable drive used in this way, and for me that means regularly performing three main tasks:
  1. Format the drive (erase any previous customers data)
  2. Encrypt the drive
  3. Copy any tools etc back onto the drive
So recently I have been working on automating this with PowerShell, through the use of the BitLocker and Storage modules.

Introducing the USB Drive Wipe Prepare project.

What you will need:
  • PowerShell 3+
  • BitLocker Module (recommend at least Windows 8, Windows 2012)

While the complete script can be found here, I will focus this post on the challenges I faced building such a solution, so that if you are working on a similar project you can benefit from my hard work :)

The most challenging aspect of this was working with the BitLocker cmdlets, so they are my main focus.

The first step is to format the drive with Format-Volume.
$Result = Format-Volume -ObjectId $($Volume.ObjectId) -FileSystem $($Volume.FileSystem) -NewFileSystemLabel $($Volume.FileSystemLabel);

The next step is to encrypt the volume with BitLocker. This involves a number of steps.

  1. Firstly, due to my employer's GPO settings (and as a best practice) I must add a recovery password protector to the drive.
    $Result = Add-BitLockerKeyProtector -MountPoint "$($Volume.DriveLetter):" -RecoveryPasswordProtector
    
    
  2. As part of this it is best practice to then make sure you have the Recovery Key saved off to a location. Earlier in the script I create a PSDrive to reference this location and simplify scripting.
    "Bitlocker Key for $($Volume.FileSystemLabel)`r`n `
    Identifier: $((Get-BitLockerVolume "$($Volume.DriveLetter):").KeyProtector.KeyProtectorId)`r`n `Key: $((Get-BitLockerVolume "$($Volume.DriveLetter):").KeyProtector.RecoveryPassword)" | Out-File -FilePath "BitLockerKeys:\$($Volume.FileSystemLabel).BitLockerKey.txt";
    
    
  3. Next I enable BitLocker on the Removable Drive with a Password (effectively using BitLocker2Go)
    $Result = Enable-BitLocker -MountPoint "$($Volume.DriveLetter):" -EncryptionMethod Aes256 -UsedSpaceOnly -Password $BitLockerPassword -PasswordProtector;
    
    
  4. As the encryption process can take some time, the next part of my script polls the encryption status (a fuller version of this loop is sketched after this list) with
    while ((Get-BitLockerVolume -MountPoint "$($Volume.DriveLetter):").EncryptionPercentage -lt 100)
    ....
    
    
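A fuller version of that polling step might look something like this (a sketch only, not the exact code from the published script):

while ((Get-BitLockerVolume -MountPoint "$($Volume.DriveLetter):").EncryptionPercentage -lt 100)
{
    $pct = (Get-BitLockerVolume -MountPoint "$($Volume.DriveLetter):").EncryptionPercentage;
    Write-Progress -Activity "Encrypting $($Volume.DriveLetter):" -Status "$pct% complete" -PercentComplete $pct;
    Start-Sleep -Seconds 5;
}
Write-Progress -Activity "Encrypting $($Volume.DriveLetter):" -Completed;
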
After encrypting the drive, my script then copies files/folders which I have stored in a common path on my laptop for use on most customer engagements. This is a specific need for my line of work; however, the functionality could be used for anything. I retrieve the path from an XML configuration file during the Begin block of the script, and if that file doesn't exist it is created. The user can also supply a -Setup switch parameter to force the script to prompt for the configuration settings and rebuild the config XML file.
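
The configuration handling itself is a simple pattern; here is a minimal sketch of it (the file name, property name and prompt text are illustrative only, not the published script's exact layout):

$configPath = Join-Path $env:APPDATA "UsbDrivePrep.config.xml";
if ($Setup -or -not (Test-Path $configPath))
{
    $config = [pscustomobject]@{ ToolsSourcePath = Read-Host "Folder to copy onto the prepared drive" };
    $config | Export-Clixml -Path $configPath;
}
else
{
    $config = Import-Clixml -Path $configPath;
}
Copy-Item -Path (Join-Path $config.ToolsSourcePath "*") -Destination "$($Volume.DriveLetter):\" -Recurse;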

This script is provided "as is" however should you be performing similar operations around Encrypting removable drives this may help you towards your solution.

As mentioned above the complete script can be found on the CodePlex project https://usbdrivepreptool.codeplex.com/




Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.
 

Friday, July 4, 2014

Format Operator vs Format Parameter for Date Time

Most of my PowerShell scripts include logging or export functionality which requires the current date/time in a particular format. This got me thinking: which formatting method actually performs better?

Get-Date Format Parameter

Command:
Get-Date -Format {yyyyMMddhhmmss}


Individual Times (ms):  0.9507, 0.3538, 0.3797, 0.2996, 0.2996, 0.3575, 0.1991, 0.2011, 0.1297, 0.1625
Overall Avg (ms): 0.33333

Format Operator

Command:
"{0:yyyy}{0:MM}{0:dd}{0:hh}{0:mm}{0:ss}" -f (Get-Date)

Individual Times (ms):  1.6881, 0.4051, 1.2866, 0.7377, 0.1761, 0.4794, 0.9889, 0.7307, 0.5377, 0.8777
Overall Avg (ms): 0.7908


And the winner is... Get-Date with the -Format parameter, most likely because the formatting is performed earlier in the stack (i.e. closer to the .Net Framework layer). However, if you look at some of the individual results there isn't much in it at times.


The full script used to perform this test is as follows:

[array]$Results = @();

for ($i=0; $i -lt 10; $i++)
{
   $Results += Measure-Command {
      Get-Date -Format {yyyyMMddhhmmss}
   } | select TotalMilliseconds;
   #helps give real results by letting the system pause for a moment
   Start-Sleep -Seconds 1;
}
Write-Host "Get-Date format parameter results"
$Results

[array]$Results = @();
for ($i=0; $i -lt 10; $i++)
{
   $Results += Measure-Command {
      "{0:yyyy}{0:MM}{0:dd}{0:hh}{0:mm}{0:ss}" -f (Get-Date)
   } | select TotalMilliseconds;
   #helps give real results by letting the system pause for a moment
   Start-Sleep -Seconds 1;
}
Write-Host "Format operator results"
$Results
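
To get the averages quoted above from the captured samples, Measure-Object does the trick (a small addition; the original runs just listed the raw values):

($Results | Measure-Object -Property TotalMilliseconds -Average).Average;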

Monday, June 30, 2014

PowerShell Amusements: Automated Speech

While setting up a lab environment today I had some time to kill and thought I would check out more recent methods for speech automation from within PowerShell. Some of the interesting aspects involved in this script are:

Get-Random - Used to select random members from the object
ArrayList object type - Used to allow the removal of array members


if ($VerbosePreferenceOrig.length -eq 0) {$VerbosePreferenceOrig = $VerbosePreference;}
$VerbosePreference = "Continue";
Add-Type -AssemblyName System.Speech 
$Speech = New-Object -TypeName System.Speech.Synthesis.SpeechSynthesizer
$Speech.Volume = 100;
#create temp copies to work with
$Voices = New-Object System.Collections.ArrayList;
$Words = New-Object System.Collections.ArrayList;

#fill the arrays which will be static
Foreach ($Item in $Speech.GetInstalledVoices().VoiceInfo.Name)
{
    $Idx = $Voices.Add($Item);
}
$Idx = $Words.Add("Hello, welcome to PowerShell 4.0 Part 1");
$Idx = $Words.Add("Did you know PowerShell rocks!");
$Idx = $Words.Add("What you are hearing, is written in PowerShell");
$Idx = $Words.Add("We are going to have some fun this week");
$Idx = $Words.Add("With dot net we can do a great deal");
$Idx = $Words.Add("Don't worry, use Get-Help");

#create temp copies to work with
$TmpVoices = New-Object System.Collections.ArrayList;
$TmpWords = New-Object System.Collections.ArrayList;

#fill the temporary arrays
foreach ($item in $Voices) {$Idx = $TmpVoices.Add($item);}
foreach ($item in $Words) {$Idx = $TmpWords.Add($item);}

For ($loop=0; $loop -lt 10; $loop++)
{
    Write-Verbose "Starting loop $loop";
    #make sure we don't completely empty the arrays
    if ($TmpVoices.Count -le 0) {foreach ($item in $Voices) {$Idx = $TmpVoices.Add($item);};}
    if ($TmpWords.Count -le 0) {foreach ($item in $Words) {$Idx = $TmpWords.Add($item);};}
    #randomly get the voice and words to say
    $Voice = Get-Random -InputObject $TmpVoices;
    $Word = Get-Random -InputObject $TmpWords;
    #remove these from the selection to reduce repetition
    $TmpVoices.Remove($Voice);
    $TmpWords.Remove($Word);
    Write-Verbose "Voice: $($Voice)";
    Write-Verbose "Words: $($Word)";
    $Speech.SelectVoice($Voice);
    $Speech.Speak($Word);
    Start-Sleep -Seconds 3;
}
Write-Host "Script Done!";

$VerbosePreference = $VerbosePreferenceOrig;

Tuesday, June 17, 2014

Accidental feature discovery... RDP client

Today I accidentally stumbled across a feature of the latest RDP client (6.3.9600.16384). Typically, when using multiple monitors (laptop + DVI monitor) I end up with different resolutions between the monitors, so when I establish an RDP session in full screen it defaults to my main monitor (the laptop). If I wanted it full screen on the second monitor I would have to manually set the resolution before connecting.

Well, today when I connected in full screen on my main monitor (the laptop) I thought "oh well, I will live with a smaller window on my second screen" and dragged the RDP window to the second screen, where it snapped to a smaller window. Then I hit the maximise button to return it to the resolution I connected at and, to my amazement, it went full screen. Just to prove it was changing resolutions on me, I dragged it back to the main monitor and maximised it, and then did the same again on my second monitor.

So it would seem the RDP client is finally able to adjust resolutions in full-screen mode. I had hoped this wasn't far away after the "smart sizing" feature was introduced a little while ago.