Sunday, August 21, 2016

New SQL Server Builds API... improve your health checks with version and support information

Things have been a bit busy for me personally over the last few months, but that is another story. I have used some of this time to get back to a little project I started a while back.

Introducing the SQL Server Builds API... providing version information, including release dates and known support life-cycle information.


Overview
The SQL Server Builds API provides SQL Server version information and known support life-cycle information. As it is a personal project of mine, the data is currently maintained manually from known sources. The main objective of the tool is to enhance health check and project planning type processes; it is not intended for licensing or other legal auditing processes.

In prior roles, and particularly in my current role, one of the most common checks I perform on a SQL Server is its version and patch level. To aid in this I would run reports against my own custom database of version numbers to validate the version of the instance being checked.


Querying the API
Querying the API is pretty simple. The base URI is http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions

The API takes two parameters:

  • Version - A string representation of the version of your SQL instance as returned by SELECT SERVERPROPERTY('PRODUCTVERSION'), e.g. '12.00.2342.00'
  • MissingUpdates - A boolean value (True/False) indicating whether to include any updates released since the supplied version in the results

So if I wanted to query for information about the version number 12.00.2342.00 my completed URI would be something like:

http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=12.00.2342.00&MissingUpdates=false

NOTE: The MissingUpdates parameter defaults to false, so you could omit it from the URI above.
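
If you want to try it from PowerShell before diving into the fuller examples below, a minimal sketch looks like this (the properties on the returned object are assumptions based on the fields used later in this post):

# Minimal sketch: query the API for a single, hard-coded version number
$Uri = 'http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=12.00.2342.00&MissingUpdates=false'
$Result = Invoke-RestMethod -Uri $Uri

# Inspect the returned object; properties such as Product, Branch, Update,
# MainstreamSupportEnd and ExtendedSupportEnd are expected based on the examples below
$Result | Format-List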


Where the data comes from
The official source of truth for life-cycle data has to be Microsoft Support Lifecycle (https://support.microsoft.com/lifecycle), and this API/tool is by no means trying to replace that. In fact, if you are doing any official auditing for licensing or other legal purposes you MUST use that source over this API's data, or at the very least perform your own validation of the data for each record.

Most DBAs and SQL admins would refer to http://sqlserverbuilds.blogspot.com.au. I am no different and have previously referred to that site as well, so it is only natural that it is still one of the sources I use to build the API's database.

The database behind this API is currently maintained manually, and even if it were somewhat automated, mistakes can happen. If you find I am missing any updates, or there is any incorrect data, please make sure you reach out and let me know.


Suggestions for using the API
The API could be used in any number of tools and checks. Here are just some I can think of:
  • Custom health check scripts
  • Enhance on-premises CMDB data
  • Power BI reports
  • Other custom applications that interface with or manage SQL Server instances

PowerShell Examples

#1 - Get version information (static version number)
# Set the version number to test
$VersionNumber = '12.00.2342.00' # SQL 2014

# Call the API to get the version information
$VersionData = Invoke-RestMethod -Uri "http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=$($VersionNumber)";

# at this point simply return the $VersionData variable for the raw info
$VersionData

# Want to improve your health check script? Calculate the health of the support status
if ($VersionData.ExtendedSupportEnd -le (Get-Date)) {$SupportStatus = 'Critical'}
elseif ($VersionData.MainstreamSupportEnd -le (Get-Date)) {$SupportStatus = 'Warning'}
else {$SupportStatus = 'Ok'}

# format the output data string
# NOTE: $Instance is assumed to be an object defined earlier in your own health check
# script; replace $($Instance.Name) with your instance name when testing standalone
$OutputData = @"
Instance = $($Instance.Name)
Version = $($VersionData.Version)
Product = $($VersionData.Product)
Branch = $($VersionData.Branch)
Update = $($VersionData.Update)
MainstreamSupportEnd = $($VersionData.MainstreamSupportEnd)
ExtendedSupportEnd = $($VersionData.ExtendedSupportEnd)
SupportStatus = $($SupportStatus)
"@

# Return the hashtable
ConvertFrom-StringData -StringData $OutputData;


#2 - Get Missing updates (static version number)
# Set the version number to test
$VersionNumber = '11.00.2342.00' # SQL 2012

# Call the API to get the known missing updates (released since current version)
$VersionData = Invoke-RestMethod -Uri "http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=$($VersionNumber)&MissingUpdates=true";

# return the output; it is already formatted as an array
$VersionData

# or as a table
$VersionData | Format-Table -Property Version,Branch,Update,@{Name="Released";Expression={Get-Date $_.Released -Format 'yyyy-MM-dd'}},Description -AutoSize


Other Examples
There are more examples in my GitHub repository - https://github.com/Matticusau/SQLServerBuildsAPI-Examples

Including:

  • Simple static value examples like above
  • Both single and multiple instance examples with live version number data direct from a SQL instance (a minimal sketch follows this list)
  • C# Windows Forms client
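
As a rough sketch of the live-instance approach (assuming the SqlServer or SQLPS module is installed so that Invoke-Sqlcmd is available; the API call mirrors example #1 above):

# Get the running version number from a live instance (the instance name is a placeholder)
$Instance = 'localhost'
$VersionNumber = (Invoke-Sqlcmd -ServerInstance $Instance -Query "SELECT CONVERT(nvarchar(32), SERVERPROPERTY('PRODUCTVERSION')) AS Version").Version

# Call the API with the live version number
$VersionData = Invoke-RestMethod -Uri "http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=$($VersionNumber)"
$VersionData
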
Just want to see it in action? I have an MVC form available at https://sqlserverbuildsportal.azurewebsites.net


Where is the source code for the API
At this time I am not making the API open source.


Want to partner with me
This is a personal project. If you have a similar project, or a project that you would like to use this API in, please reach out and let's improve the world together.

If you have any questions about this API please comment or reach out to me through any of my contact channels.

Legal Stuff: The contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.


Friday, April 15, 2016

T-SQL to identify queries which are candidates for parameterization

While discussing the concepts of 'optimize for ad hoc workloads' and 'Forced Parameterization' in SQL Server, I decided to modify an existing query I wrote to analyse query performance from the cache (more on it here) so that it now analyses the ad hoc queries for candidates which could benefit from parameterization.

;WITH CTE(QueryCount, StatementTextForExample, plan_handle, QueryHash, QueryPlanHash, CacheObjType, 
    ObjType) 
AS 
( 
   SELECT
    COUNT(query_stats.query_hash) AS QueryCount
    , MIN(query_stats.query_text) AS StatementTextForExample
    , MIN(query_stats.plan_handle) AS plan_handle
    , query_stats.query_hash AS QueryHash
    , query_stats.query_plan_hash AS QueryPlanHash
    , query_stats.CacheObjType
    , query_stats.ObjType
   FROM  
       ( 
           SELECT  
      qs.query_hash
      , qs.query_plan_hash 
      , qs.plan_handle
      , cp.cacheobjtype as [CacheObjType]
      , cp.objtype as [ObjType]
      , SUBSTRING(qt.[text], (qs.statement_start_offset/2) + 1, ((
                   CASE
                       WHEN qs.statement_end_offset = -1 THEN LEN(CONVERT(NVARCHAR(MAX), qt.[text])) * 2
                       ELSE qs.statement_end_offset
                   END - qs.statement_start_offset)/2) + 1
               ) AS query_text 
           FROM  
               sys.dm_exec_query_stats AS qs 
      INNER JOIN sys.dm_exec_cached_plans cp ON cp.plan_handle = qs.plan_handle
               CROSS APPLY sys.dm_exec_sql_text(qs.[sql_handle]) AS qt 
           WHERE qt.[text] NOT LIKE '%sys.dm_exec_query_stats%' 
     AND cp.objtype = 'AdHoc'
     --AND qs.last_execution_time BETWEEN DATEADD(hour,-1,GETDATE()) AND GETDATE() --change hour time frame
       ) AS query_stats 
 GROUP BY query_stats.query_hash
  , query_stats.query_plan_hash
  , query_stats.CacheObjType
  , query_stats.ObjType
 HAVING COUNT(query_stats.query_hash) > 1
) 
SELECT  
   CTE.QueryCount
   , CTE.CacheObjType
   , CTE.ObjType
   , CTE.StatementTextForExample
   , tp.query_plan AS StatementPlan
   , CTE.QueryHash
   , CTE.QueryPlanHash 
FROM 
   CTE 
   OUTER APPLY sys.dm_exec_query_plan(CTE.plan_handle) AS tp 
ORDER BY CTE.QueryCount DESC; 

When you identify these candidates you then need to look at the most appropriate resolution, such as:

  1. Rewrite the query at the application layer to ensure it is called with parameterization from the data provider
  2. Rewrite the query as a stored procedure
  3. Enable 'optimize for ad hoc workloads' on your SQL instance
    EXEC sys.sp_configure N'optimize for ad hoc workloads', N'1'
    GO
    RECONFIGURE WITH OVERRIDE
    GO
    
  4. Enable 'Forced Parameterization' on the affected database
    USE [master]
    GO
    ALTER DATABASE [AdventureWorksPTO] SET PARAMETERIZATION FORCED WITH NO_WAIT
    GO
    
Of course, the appropriate option really depends on a larger view of your environment and applications that only you can determine. If you do have any thoughts on this, please feel free to add them to the comments below to help others.

A word of caution too: before adjusting any sp_configure settings with RECONFIGURE, make sure you run the following to check for any settings which are not yet active. It is expected to see 'min server memory (MB)' in the results if you leave that setting at its default of 0, as SQL Server must reserve the minimum memory possible, which is 16 MB.
SELECT * FROM sys.configurations
WHERE [value] <> value_in_use;


Legal Stuff: The contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.





 

Wednesday, March 23, 2016

PowerShell: Finding commands (functions) with Risk Mitigation capabilities

Yesterday I was talking to a workshop about implementing Risk Mitigation capabilities in your own advanced functions in PowerShell. This got me thinking about how to identify the commands that have that capability enabled and, more importantly, the configuration of the ConfirmImpact setting.

Background:
If you don't know, PowerShell uses a number of inputs when determining how the Risk Mitigation capabilities are applied.
For the WhatIf capability it is purely the presence of the -WhatIf switch parameter.
For the Confirm capability it is a combination of the ConfirmImpact value in the [CmdletBinding()] attribute and the $ConfirmPreference variable value, or the presence of the -Confirm parameter.
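
To illustrate, a minimal advanced function opting into these capabilities might look like the following (the function name and body are purely illustrative):

function Remove-Widget
{
    # SupportsShouldProcess enables -WhatIf/-Confirm; ConfirmImpact is weighed against $ConfirmPreference
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='High')]
    Param(
        [Parameter(Mandatory=$true)]
        [string]$Name
    )

    # ShouldProcess honours -WhatIf, and prompts when ConfirmImpact meets or exceeds $ConfirmPreference or -Confirm is used
    if ($PSCmdlet.ShouldProcess($Name, 'Remove widget'))
    {
        Write-Verbose "Removing widget '$Name'"
        # ... the destructive work would happen here ...
    }
}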

The following command will show all commands with the Risk Mitigation capability.
Get-Command | Where-Object {$PsItem.Parameters.Keys.Count -gt 0 -and $PsItem.Parameters.ContainsKey('WhatIf')}


We can extend this further to retrieve the ConfirmImpact value; however, this is not exposed through a property on the command object and can only be retrieved from the source definition. Unfortunately for cmdlets this is not available, as they are compiled. It will work for functions though, so you can check your own commands etc.

Get-Command -CommandType Function | `
   Where-Object {$PsItem.Parameters.Keys.Count -gt 0 -and $PsItem.Parameters.ContainsKey('WhatIf')} | `
   Select-Object -Property Name,CommandType,@{Name="Impact";Expression={[Regex]::Match($PSItem.Definition, "(ConfirmImpact='(?<Impact>.{1,})'{1})").Groups["Impact"].Value}}


This pipeline command uses the [Regex] type accelerator and the Match static method, but similar could be achieved with the -match operator.
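
For example, a rough equivalent using the -match operator and the automatic $Matches variable might look like this (the regular expression is a simplified assumption):

Get-Command -CommandType Function | `
   Where-Object {$PsItem.Parameters.Keys.Count -gt 0 -and $PsItem.Parameters.ContainsKey('WhatIf')} | `
   Select-Object -Property Name,CommandType,@{Name="Impact";Expression={
      # -match sets $Matches in the current scope when the pattern is found
      if ($PSItem.Definition -match "ConfirmImpact='(?<Impact>[^']+)'") { $Matches['Impact'] } else { '' }
   }}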


Legal Stuff: As always, the contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Wednesday, March 2, 2016

PowerShell ISE add-on to toggle collapsible sections in all files

UPDATE 03/03/2016: It's amazing how quickly this went from a simple script to, within an hour yesterday, becoming a module that is now released on the PowerShell Gallery and GitHub. Check out the instructions here to install it.
---------------

One feature I am hoping to see in the PowerShell ISE is for the state of collapsed sections to be maintained for "recent" files. It is just annoying, when you have a very large file that you are regularly working on, to have to re-collapse all the sections you had previously collapsed each time you open it up... anyway, you can vote on that idea here.

If you didn't know, there is a keyboard shortcut, CTRL+M, which will toggle the collapsed sections for the current file. Handy little gem, that one. We can also invoke this programmatically with:

$psISE.CurrentFile.Editor.ToggleOutliningExpansion()

This got me thinking: I can easily write a function that iterates through each open tab/file and collapses it. Even better, I can wrap that up into an add-on for the ISE.

I've added a menu item and also the keyboard shortcut CTRL+SHIFT+M to implement this.

Save the following code into your Profile script and enjoy!

#requires -Version 4
<#
    Script:     ISEColapseAllFiles.ps1
    Author:     Matt Lavery
    Created:    02/03/2016
    Version:    0.0.1
    
    Change History
    Version    Who          When           What
    --------------------------------------------------------------------------------------------------
    0.0.1      MLavery      02/03/2016     Initial Coding
    
#>

<#  
    .SYNOPSIS
        Toggles the state of all expandable sections in all open files within the ISE

    .DESCRIPTION
        Toggles the state of all expandable sections in all open files within the ISE
        Works across PowerShell Tabs.
        Implements the keyboard shortcut CTRL+SHIFT+M and a menu item in the Add-ons menu.
#>
function Set-ISECollapseAllFiles
{
    [CmdletBinding()]
    Param()

    Foreach ($psISETab in $psISE.PowerShellTabs)
    {
        Write-Verbose "PS Tab: $($psISETab.DisplayName)";
        foreach ($psISEFile in $psISETab.Files)
        {
            Write-Verbose "PS File: $($psISEFile.DisplayName)";
            $psISEFile.Editor.ToggleOutliningExpansion()
        }
    }
}

# remove the existing menu item if it exists
if ($psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.DisplayName.Contains('Toggle Collapse All Files'))
{
    $psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Remove(($psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.GetEnumerator() | Where-Object DisplayName -EQ 'Toggle Collapse All Files'));
}
# add the add-on menu
$psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add('Toggle Collapse All Files',{Set-ISECollapseAllFiles},"CTRL+SHIFT+M");



Legal Stuff: As always, the contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Thursday, February 25, 2016

Generating Certificates for Desired State Configuration lab/test environments

A colleague pointed out to me today that it has been a long time since I blogged, and you know what, he is right: I have mostly been posting tweets about interesting news. Given I was taking some colleagues through setting up a Desired State Configuration lab environment and ran into an issue with the certificates we were trying to use (and to make it worse, the issue is something I have faced in the past but had forgotten due to the time lag), this seemed like a great topic to blog about.

Now, a search in your favorite search engine would show that this is not an uncommon problem, but there aren't many who answer it plainly and simply.

Do not use New-SelfSignedCertificate to generate a certificate for testing DSC deployments (encrypting credentials) on Windows Server 2012 R2, or you will most likely receive the error:

    The private key could not be acquired.
        + CategoryInfo          : NotSpecified: (root/Microsoft/...gurationManager:String) [], CimException
        + FullyQualifiedErrorId : MI RESULT 1
        + PSComputerName        : localhost


Solution #1: Download and use the script/function from Script Center - https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6
 

Solution #2: Generate the certificate with MakeCert.exe from the Visual Studio Command Prompt.

For example, to make a certificate called DSCDemo.

    makecert.exe -r -pe -n "CN=DSCDemo" -sky exchange -ss my -sr localMachine
Further info on makecert can be found at https://msdn.microsoft.com/library/bfsktky3%28v=vs.100%29.aspx
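
Once a working certificate is in place, the public key is typically exported and referenced in the configuration data used to encrypt credentials, while the node's LCM decrypts them with the private key. A minimal sketch (the path, node name and thumbprint are placeholders):

# Export the public certificate from the target node ({thumbprint} is a placeholder)
Get-Item Cert:\LocalMachine\My\{thumbprint} | Export-Certificate -FilePath C:\Certs\DSCDemo.cer

# Configuration data referencing the public cert used to encrypt credentials,
# and the thumbprint of the certificate the node uses to decrypt them
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName        = 'DSCTestNode'
            CertificateFile = 'C:\Certs\DSCDemo.cer'
            Thumbprint      = '{thumbprint}'
        }
    )
}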

The background: I was trying to set up a DSC pull server with a configuration for installing a SQL Server instance using xSqlPs. This required a credential to be stored in the configuration for the 'sa' account, which clearly needed to be encrypted, as I didn't want plain text passwords in the DSC configuration. We tried all sorts of methods for exporting the certificate from the pull server, but eventually the clue was in the fact that even on the pull server the certificate reported no private key data in its properties.


    PS C:\Users\mattl> get-item Cert:\LocalMachine\my\{thumbprint} | fl *
    ...

    Extensions               : {System.Security.Cryptography.Oid, System.Security.Cryptography.Oid,
                               System.Security.Cryptography.Oid, System.Security.Cryptography.Oid}
    FriendlyName             : DSCDemo
    IssuerName               : System.Security.Cryptography.X509Certificates.X500DistinguishedName
    NotAfter                 : 2/25/2017 4:02:31 AM
    NotBefore                : 2/25/2016 3:42:31 AM
    HasPrivateKey            : True
    PrivateKey               :
    PublicKey                : System.Security.Cryptography.X509Certificates.PublicKey
    ...

    Issuer                   : CN=DSCDemo
    Subject                  : CN=DSCDemo

Again, the clue here was the fact that the private key isn't even displayed on the server where it was generated, so we know there is nothing wrong with the export/import; the problem is actually with the way it was generated.

A couple of searches later and I found the blog post I used the last time I faced this issue (yep, isn't that annoying). It turns out this is a problem with the New-SelfSignedCertificate cmdlet; when you generate the certificate with MakeCert.exe as per above, the private key data is visible on the server it is generated on and also on the server it is imported to (as long as you export the private key data too).

Hope this helps a few others, or at least helps me remember next time I face this problem before I waste a few hours trying to figure it out again ;)
 

BTW, in other news, did you see the post that WMF 5.0 (aka PowerShell v5) RTM has been re-released? Happy days.

Legal Stuff: As always, the contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Thursday, October 29, 2015

Latest SSMS streamlines the process for adding Azure Firewall Rules at connection

If you haven't heard, Microsoft have broken the release of SQL Server Management Studio away from the main product. While at this stage it is continuing to be released at the same cadence as the SQL 2016 CTP 2 cycle (i.e. monthly), the updates are targeting features for existing releases, Azure SQL, and of course SQL 2016. You can get the latest release at http://aka.ms/ssms.

I have been working on a project of late with an Azure SQL DB back-end, and one thing that has always frustrated me (given my role keeps me out in the field) is that I am constantly having to log into the portal and update the firewall rules... Well, the latest release of SSMS makes this process much easier.

Now when you try to connect to an Azure SQL server and a firewall rule on the server blocks your connection, you are no longer just shown the message informing you that the client is not allowed to connect. Instead, you are prompted with a dialog asking you to log into Azure and create a firewall rule. It even populates the client IP address for you and suggests an IP range.


Once you authenticate, select the appropriate option for the firewall rule (e.g. static IP or IP range). Click OK and SSMS will place a call to the Azure web services to add the firewall rule.


Once the rule is added, the authentication process continues, which is a nice touch because I was half expecting to be prompted to log in again.

 
Obviously, for this to work the user needs the required permissions on the Azure subscription.

This is a great example of how the team is actively working on improving the toolsets and responding to feedback, so make sure you keep the Connect items rolling in.


In other news, I am presenting with a colleague at MS Ignite Australia next month on some of the new toolset features in SQL 2016. If you're attending, make sure you come and say hi, as I will be around the Data Den and Exhibition Hall throughout the event.

https://msftignite.com.au/sessions/session-details/1524


Legal Stuff: As always, the contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Tuesday, June 30, 2015

SCOM: SQL Server Management Pack 6.6.0.0 released

The SQL Server Management Pack has been refreshed with all-new dashboards, which look to be fully extensible and designed for larger server farms. Full details on how to configure the dashboards are in the SQLServerDashboards.docx guide provided at the download links.

The new dashboards do look sexy though, judging by the screenshots in the guides.





Version 6.6.0.0 can be downloaded at:

SQL 2005-2012: https://www.microsoft.com/en-us/download/details.aspx?id=10631
SQL 2014: http://www.microsoft.com/en-us/download/details.aspx?id=42573

Again, check out the specific guides provided on implementing these dashboards.


Legal Stuff: As always, the contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.