Tuesday, November 5, 2013

RML now supports SQL 2012 (and SQL 2014)

If you use the Replay Markup Language (RML) tool set then, like me, you will have been frustrated that it hadn't been updated to work with trace files captured from SQL 2012. You could probably assume this was because the SQL Profiler Trace feature for trace capture (database engine only) is flagged for deprecation in future releases (see http://technet.microsoft.com/en-us/library/ms181091.aspx).

Thankfully last week a new version was released which fully supports SQL 2012 and even SQL 2014 CTP2 trace file definitions. I have just gone through the process of uninstalling my RML tools and installing the new version, and straight away I could process the trace files captured on a SQL 2012 instance. Hurray!

Be warned though: you cannot use the new version of the Reporter tool to access databases of previously processed trace data. I typically use this tool for performance analysis, so I might keep the analysis databases around for a few weeks; this just means I have to reprocess the trace files to generate reports compatible with the new version. Still worth it to get the benefit of the new supportability.
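Reprocessing is simply a matter of pointing ReadTrace at the original .trc files again. From memory the command looks something like the following (the paths, server, and database names are placeholders, and you should check ReadTrace -? for the exact switches in your version):

```
ReadTrace -I"D:\Traces\MyCapture.trc" -o"D:\Traces\Output" -S"MyServer\SQL2012" -d"PerfAnalysis" -E
```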


Description of the Replay Markup Language (RML) Utilities for SQL Server
http://support.microsoft.com/kb/944837

If you find that you cannot click the navigational hyperlinks on the Main Performance Overview report (e.g. "Unique Batches", "Unique Statements", "Interesting Events", etc.) then you will need to install the hotfix for Report Viewer.

Update fixes several Report Viewer issues after you install Visual Studio 2010 Service Pack 1
http://support.microsoft.com/kb/2549864
Or go directly to the download location at https://connect.microsoft.com/VisualStudio/Downloads/DownloadDetails.aspx?DownloadID=37409


Parsing SQL ERRORLOG's for errors with PowerShell

I've been finding the need to parse/grep the SQL Server ERRORLOG of late to look for any errors which have occurred, and for that I have been turning to my trusted friend PowerShell.

The cmdlet that gives us all the power here is Select-String.

I will admit the hardest part about using the Select-String cmdlet for me was learning RegEx, but once you have the basics it is very powerful, and this approach can be used for any number of applications.

So at the very basic level, the following command can be used to report the lines with errors from the latest SQL ERRORLOG:

Select-String -Path .\ERRORLOG -Pattern "(error:)\s\d{1,6}"

You could even use a wildcard in the path parameter to look for historical files, but for my need I was only interested in the current file.

Now that is all good and well, but SQL Server doesn't record the description of the error on the same line as the error number, so I needed a way to pull the next line from the file as well. For this I used the PowerShell pipeline to loop through each of the matched objects returned by Select-String with the Foreach-Object cmdlet. Within that cmdlet's scriptblock I first output the line retrieved by Select-String, then use Get-Content to read the file and pull out only the line number I am interested in. Remember that Get-Content returns a zero-based array while Select-String's LineNumber property is one-based, so indexing the array with the unmodified LineNumber conveniently returns the next line of the file.

Select-String -Path .\ERRORLOG.1 -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$((Get-Content "$($_.filename)")[$_.linenumber])"}

The performance of this one-line approach is not the greatest because it has to run the Get-Content cmdlet for each matched line. So if you are looking for speed, this two-line approach has you covered. The only downside is that if you change the file name you are looking at then you have to update it in two places, whereas the one-line approach passes that value along the pipeline.

$filedata = (Get-Content .\ERRORLOG)
Select-String -Path .\ERRORLOG -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$($filedata[$_.linenumber])"}
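If you want to see the difference for yourself, Measure-Command gives a quick (if rough) timing of either approach. For example, timing the slower one-liner looks like this:

```powershell
# Time the one-line approach (Get-Content runs once per matched line)
Measure-Command {
    Select-String -Path .\ERRORLOG.1 -Pattern "(error:)\s\d{1,6}" |
        Foreach-Object { (Get-Content "$($_.Filename)")[$_.LineNumber] }
} | Select-Object -ExpandProperty TotalMilliseconds
```

Wrap the two-line version in the same scriptblock to compare.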


Now if you were processing multiple files, you could do something like this multi-line or scripted version.

$filelist = Get-ChildItem .\ERRORLOG*;
Foreach ($file in $filelist)
{
$filedata = (Get-Content $file.Name);
Select-String -Path $file.Name -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$($filedata[$_.linenumber])"}
} 

Or if you want to execute that from the PowerShell console, then the two-line approach would be:
$filelist = Get-ChildItem .\ERRORLOG*;
Foreach ($file in $filelist){$filedata = (Get-Content $file.Name);Select-String -Path $file.Name -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$($filedata[$_.linenumber])"}}


Or if you are looking to summarise the number of errors occurring across all of the ERRORLOG files, the following provides insight into that:

Select-String -Path ERRORLOG* -Pattern "(error:)\s\d{1,6}" | Group-Object -Property {$_.Line.Substring(0,10)},{$_.Matches.Value} | Sort-Object Name
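If you would rather keep that summary than read it off the console, the grouped results pipe straight into Export-Csv (the output file name here is just an example):

```powershell
# Summarise error counts by date and error number, then save to CSV
Select-String -Path ERRORLOG* -Pattern "(error:)\s\d{1,6}" |
    Group-Object -Property {$_.Line.Substring(0,10)},{$_.Matches.Value} |
    Sort-Object Name |
    Select-Object Name, Count |
    Export-Csv -Path .\ErrorSummary.csv -NoTypeInformation
```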


Happy investigations :)

If you are looking for a RegEx reference, here is the one I use:
http://www.regular-expressions.info/


Legal Stuff: As always the contents of this blog are provided "as-is". The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Thursday, October 24, 2013

My Windows 8.1 Post Install Tips

After updating my Surface RT device and laptop to Windows 8.1 I thought I would share my experience. I will note that with both I chose to do a clean install.

Overall Experience
Firstly, my overall experience has been exceptional. For the Surface RT install I simply accessed the update from the Store using the direct link. It took a while for the update to start downloading, but I believe the Store app in the 8.1 Preview was causing that; one to two hours later I was running Windows 8.1. I signed in with my Microsoft Account and all my settings were synced, then installed my apps from the Store, which was easy because of the new view which can filter to show only "your apps not installed on this pc".

For my laptop it was another painless install, probably enhanced by the deployment method adopted by Microsoft IT, but still it was exceptional to see what can be achieved when deployment is managed efficiently. Anyway, an hour later I was running Windows 8.1 and restoring my documents from my backup (for all the fun that went with that, see my other post).

My Post Install Recommendations
These are my tips to follow once you have upgraded to or installed Windows 8.1:
  1. Link your Microsoft Account with your domain/local account if you didn't sign in with that account. This will give you the full experience with synchronization across your devices.
  2. Set PowerShell as the default for the Win+X menu over Command Prompt. If, like me, you are a frequent user of PowerShell then this will save plenty of time. And if you aren't using PowerShell yet, well, get with the times; you can still use native Windows commands the same as you would in the Command Prompt.
    The following is where this setting is controlled in the TaskBar Properties

    Then this is the result: on the left is the default view, on the right is the view with PowerShell. This menu is accessed either by right-clicking the Windows logo in the bottom left corner or by pressing Windows Key (Win) + X.
  3. Adjust the Power Settings for what the power buttons do. I like to take advantage of the fast hibernate feature in Windows 8 as I am often on the road, so I pick that over Sleep for what the power button does.

I will continue to add to this blog as I find more recommendations and tips.


Windows 7 File Recovery feature removed from Windows 8.1

As you "should" be aware, keeping a backup of your important documents and files is critical to managing a PC, server, or device. For me I have achieved this through the Windows 7 File Recovery feature in both Windows 7 and Windows 8, with an external hard drive or network storage as the backup destination.

In Windows 8 the File History feature was brought in to replace the Windows 7 File Recovery (aka Backup and Restore) feature, although the older feature was still available. Well, in Windows 8.1 Windows 7 File Recovery has now been removed.

So if, like me, you have used that feature to perform your pre-upgrade backup with the plan to then restore the files to your new Windows 8.1 system, then sadly it is not as simple as it used to be. However, it is possible, and here are the steps that I performed.

  1. Create a VM on your computer using the Hyper-V feature and install the Windows 7 or Windows 8 OS (but not 8.1)
    Alternatively if you have a Windows 7 or Windows 8 computer just use that. I was fortunate that I had a Windows 7 VM already built and just imported that into my laptop.
  2. If the backup is on an external HDD, attach the external drive containing the backup to the computer. If using a Windows 7 VM then you will need to set the drive to offline before you can attach it to the VM.
  3. Start the VM (if required)
  4. Open "Backup and Restore" on Windows 7 or "Windows 7 File Recovery" on Windows 8.
  5. Use the wizard to restore files from an existing backup and locate the backup on your external drive or network location.
  6. Select the files/folders that you wish to recover
  7. Now the important part, set the restore location to a sub folder on the same External HDD (ensure you have enough space). If you aren't using a VM you might be able to restore directly to your new Windows 8.1 computer over the network but I was using a VM without a network adapter attached.
  8. Once the restore completes, detach the external drive from the VM or other computer. Shut down the VM as it is no longer required (if appropriate)
  9. Attach the External HDD to the Windows 8.1 computer and manually copy all the files to the required locations.
  10. Finally setup Windows File History so you have backups of your documents again in a format that is supported by Windows 8.1 and easily recover files in the future.
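For step 2, taking the physical disk offline (so Hyper-V will accept it as a pass-through disk) can be done with diskpart from an elevated prompt. The disk number below is only an example, so check the list disk output first to identify your external drive:

```
DISKPART> list disk
DISKPART> select disk 2
DISKPART> offline disk
```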


Of course, if you are using the Windows 8 File History feature then ignore all of this, as that feature is still current in Windows 8.1, and I will be updating all of my backups to use it from here on out :)

Monday, October 21, 2013

Tips for securing xp_cmdshell (when you really really have to use it)

If you have ever read anything about securing SQL Server, one of the most common threats called out is the use of xp_cmdshell. While it is generally accepted that enabling this in a production environment is a bad thing, there are some legitimate cases where business, application, or, probably more accurately, legacy processes require the use of xp_cmdshell. When we do enable this feature it is important to ensure that security is strictly locked down to prevent unwanted access to sensitive areas on the server or, worse still, malicious actions on local and network components.
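For context, enabling the feature in the first place is done through sp_configure, as it is off by default:

```sql
-- xp_cmdshell is an advanced option and is disabled by default
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```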

I was recently asked what my recommendations would be for securely implementing xp_cmdshell, and in my experience here are the steps to perform this:

Step 1
By default xp_cmdshell will execute under the context of the SQL Server service account, therefore the first step in reducing any risk is to ensure that the account used for the SQL Server service is aligned to best practices. Such recommendations include the use of an account which:
    - Is not a local or domain administrator
    - Is a domain account where network resources may be required by aspects of the SQL environment (e.g. copying backup files to a network path)
    - Has minimal local and domain privileges
    - Has been configured using the SQL Server Configuration Manager

To completely secure the environment a separate account should also be used for the SQL Agent service with the same recommendations.

Step 2
Ensure that only the required users are members of the SQL Server sysadmin server-level role. Any members of this role will be able to execute xp_cmdshell and therefore will have access to all aspects of the server that the SQL Server service account can access.

Step 3
Create a xp_cmdshell proxy account following the instructions at http://technet.microsoft.com/en-us/library/ms175046.aspx
This proxy account should be a unique domain user, separate from the SQL Server service account. It will be used when non-sysadmin SQL Logins execute xp_cmdshell, and therefore an even more restrictive Access Control List (ACL) can be configured on the SQL Server and network resources for that specific account. This will significantly reduce the surface area which is vulnerable to threat via xp_cmdshell.
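As a sketch of what that TechNet article walks through, the proxy credential is created with sp_xp_cmdshell_proxy_account (the domain account and password below are placeholders):

```sql
-- Creates the ##xp_cmdshell_proxy_account## credential
EXEC sp_xp_cmdshell_proxy_account 'CONTOSO\SqlCmdShellProxy', 'ReplaceWithStrongPassword';

-- To drop the proxy credential again:
-- EXEC sp_xp_cmdshell_proxy_account NULL;
```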

Step 4
Grant the required permissions to the specific non-sysadmin SQL Logins who require the ability to execute xp_cmdshell, using the syntax: GRANT EXECUTE ON xp_cmdshell TO [login_name]
This will ensure that only the SQL Logins which you have configured, either as members of the sysadmin fixed server role or as explicitly granted execute permissions, will be able to access the system via xp_cmdshell.
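Putting it together, here is a quick sanity check that a non-sysadmin login runs the shell under the proxy account rather than the service account (the login name is a placeholder):

```sql
USE master;
GRANT EXECUTE ON xp_cmdshell TO [CONTOSO\AppBatchLogin];  -- placeholder login

-- Impersonate the login and confirm which Windows account the shell runs as
EXECUTE AS LOGIN = 'CONTOSO\AppBatchLogin';
EXEC xp_cmdshell 'whoami';
REVERT;
```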
To view which SQL Logins have been granted permissions for xp_cmdshell run the following TSQL:

USE master;
GO
SELECT sys.schemas.name AS [schema_name]
, AllObjects.name AS [object_name]
, sys.database_permissions.permission_name
, sys.database_permissions.state_desc
, sys.database_principals.name AS [granted_to_principal_name]
FROM sys.database_permissions
INNER JOIN sys.database_principals ON sys.database_principals.principal_id = sys.database_permissions.grantee_principal_id
INNER JOIN (
        SELECT name, object_id, principal_id, schema_id, parent_object_id, type, type_desc, create_date, modify_date, is_ms_shipped, is_published, is_schema_published
        FROM sys.objects
        UNION
        SELECT name, object_id, principal_id, schema_id, parent_object_id, type, type_desc, create_date, modify_date, is_ms_shipped, is_published, is_schema_published
        FROM sys.system_objects
) AllObjects ON AllObjects.object_id = sys.database_permissions.major_id
LEFT JOIN sys.schemas ON sys.schemas.schema_id = AllObjects.schema_id
WHERE sys.schemas.name = 'sys'
AND AllObjects.name = 'xp_cmdshell'
ORDER BY sys.schemas.name
, AllObjects.name
, sys.database_principals.name
, sys.database_permissions.class;

 
Final Step
Start reviewing the processes, code, and reasons for using xp_cmdshell to determine whether a more secure method (e.g. PowerShell) could be used to achieve the same outcome. Some might argue this should be the first step, but let's be realistic: you cannot always change an application's behaviour, especially if it is provided by a 3rd party. So while this step definitely needs to be performed, it may be the most difficult and longest of them all.




Friday, August 30, 2013

Analytics for Twitter 2013 in the lead up to MS TechEd Australia

TechEd Australia is just around the corner and it's always an interesting time to dive into the BI features of Excel, especially with the out-of-the-box Analytics for Twitter 2013. I will definitely be watching this over the course of next week and using it to get some interesting insights into my own session (but more on that later).

If you haven't already seen the analytics tool, here is a current view of the analysis of the query "auteched".


You can already get some interesting info, like the fact that most activity is happening around lunch time, which is probably expected, but it will definitely be interesting to trend this over the course of next week.



My favourite is the geospatial data, as it makes for some very interesting and easy-to-digest visualisation.


And an interesting race at the moment is between the iOS and Windows devices (I know which one I hope wins), but still they are both outdone by the Web (and whatever "Other" is).



Ahhhh BI, we can only make guesses without you


Wednesday, July 10, 2013

TSQL to find Top Queries by Avg CPU, IO, Time

One of my favourite performance tools has certainly become the RML reports, as you can quickly identify queries with common hot spots across performance metrics. However this requires a bit of work: you first have to configure your performance trace (usually with Diag Manager), capture the SQL Profiler trace, and then analyse it with RML/SQLNexus. Of course the UI with XEvents makes this a bit easier if you use that as your capture. However, what if you want to look at the general overall performance of a server, or over a longer duration?
We know that SQL Server is always capturing data for us in the background and exposes it to us with DMVs and DMFs. So I came up with the following query which uses the DMVs/DMFs to report the top queries.
You can change the sort order at the end of the query; however, as I have included the rank number per performance metric (CPU, IO, Time), you can quickly see queries which are in the top of each group without needing to re-sort the results or run multiple queries.
If you did want to run multiple queries with different sort columns, the Query Number will remain the same as it is based on CPU; the reason I included this is to allow you to easily compare the queries across multiple executions (depending on the time between executions).
;WITH CTE ([Query Num], [Total Executions], [Total CPU], [Avg CPU], [Avg CPU Rank], [Total IO], [Avg IO], [Avg IO Rank], [Total Elapsed Time], [Avg Elapsed Time], [Avg Elapsed Time Rank], [Sample Statement Text], [Query Plan Handle], [Statement Hash], [Query Plan Hash])
AS
(
    SELECT TOP 50
        ROW_NUMBER() OVER (ORDER BY (SUM(total_worker_time) / SUM(execution_count)) DESC) AS [Query Num]
        , SUM(execution_count) AS [Total Executions]
        , SUM(total_worker_time) AS [Total CPU]
        , SUM(total_worker_time) / SUM(execution_count) AS [Avg CPU]
        , RANK() OVER (ORDER BY (SUM(total_worker_time) / SUM(execution_count)) DESC) AS [Avg CPU Rank]
        , SUM(total_physical_reads + total_logical_reads + total_logical_writes) AS [Total IO]
        , SUM(total_physical_reads + total_logical_reads + total_logical_writes) / SUM(execution_count) AS [Avg IO]
        , RANK() OVER (ORDER BY (SUM(total_physical_reads + total_logical_reads + total_logical_writes) / SUM(execution_count)) DESC) AS [Avg IO Rank]
        , SUM(total_elapsed_time) AS [Total Elapsed Time]
        , SUM(total_elapsed_time) / SUM(execution_count) AS [Avg Elapsed Time]
        , RANK() OVER (ORDER BY (SUM(total_elapsed_time) / SUM(execution_count)) DESC) AS [Avg Elapsed Time Rank]
        , MIN(query_text) AS [Sample Statement Text]
        , MIN(plan_handle) AS [Query Plan Handle]
        , query_hash AS [Statement Hash]
        , query_plan_hash AS [Query Plan Hash]
    FROM
    (
        SELECT
            qs.*
            , SUBSTRING(st.[text], (qs.statement_start_offset / 2) + 1, ((
                CASE
                    WHEN qs.statement_end_offset = -1 THEN LEN(CONVERT(NVARCHAR(MAX), st.[text])) * 2
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1
            ) AS query_text
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.[sql_handle]) AS st
        WHERE st.[text] NOT LIKE '%sys.dm_%'
        --AND DATEDIFF(hour, qs.last_execution_time, GETDATE()) < 1 --change hour time frame
    ) AS query_stats
    GROUP BY query_hash, query_plan_hash
)
SELECT
    [Query Num]
    , [Total Executions]
    , [Total CPU]
    , [Avg CPU]
    , [Avg CPU Rank]
    , [Total IO]
    , [Avg IO]
    , [Avg IO Rank]
    , [Total Elapsed Time]
    , [Avg Elapsed Time]
    , [Avg Elapsed Time Rank]
    , DB_NAME(qp.dbid) AS [DB Name]
    , [Sample Statement Text]
    , qp.query_plan AS [Estimated Query Plan]
FROM CTE
OUTER APPLY sys.dm_exec_query_plan([Query Plan Handle]) AS qp
--ORDER BY [Avg CPU] DESC
ORDER BY [Avg IO] DESC
--ORDER BY [Avg Elapsed Time] DESC


NOTE: This query includes a TOP 50, which improves performance but only looks at very recent data. I would recommend using the predicate on last_execution_time to restrict the data to a valid time frame. This would still include the total history for those requests, but ensure that you are only looking at recently active queries.
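For example, changing the inner WHERE clause as follows would limit the analysis to queries executed in the last 24 hours (adjust the hour count to suit your window):

```sql
        WHERE st.[text] NOT LIKE '%sys.dm_%'
        AND DATEDIFF(hour, qs.last_execution_time, GETDATE()) < 24
```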


Here is an example output. In this example I had cleared the cache and only had a small set of test data, so the Avg IO rank is the same for each query, but this gives you an idea.
[Screenshot: Top N Queries by Hash and Rank]

So far I’ve tested this on SQL 2008 R2 and above.
I’m also closing in on finalising the BETA release of my DBA Admin and Performance SSMS reports which will include this query and take this to a whole new level of visualisation.
