Wednesday, November 13, 2013

PowerShell: Clearing the values within a HashTable without removing the Keys

I had an interesting question posed to me today:

"Is it possible to clear the contents of all the values within a HashTable in PowerShell but retaining the Key names (i.e. the construct of the hashtable)."

This sounds perfectly reasonable, but unfortunately there is no native way of doing it, so you only have the following options:
1) Set the HashTable variable to $null
2) Rebuild the HashTable from a definition with empty values

Neither of these two options is very dynamic or suited to the application. For an example of why option 2 falls short, see the snippet below.
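
To illustrate the point, here is a minimal sketch of option 2. The key names here are just examples; the problem is that every key has to be hard-coded again, which defeats the purpose if the hashtable is built dynamically elsewhere:

#Option 2: redefine the hashtable with empty values.
#Every key name must be known and typed out again.
$hash = @{FirstName="";LastName="";Age=""}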

My first thought at a solution was to loop through the HashTable's Keys collection and modify the value of each entry, however that results in:

Collection was modified; enumeration operation may not execute.

The solution, though, is to use the .Clone() method of the HashTable to build a copy of it in memory just for the sake of looping through its entries. Here is my example code:

$hash = @{FirstName="Bob";LastName="Smith";Age="23"}
$hash

#create a clone of the hash table, this avoids the 
# "Collection was modified" enumeration error
$hash2 = $hash.Clone();

#loop through each key in the cloned hashtable and update the 
# value in the original table
foreach($key in $hash2.keys){$hash[$key] = '';}

#clear the cloned hashtable to free memory
$hash2 = $null;

#just output the hashtable for verification in this demo
$hash
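
If you find yourself doing this often, the same pattern can be wrapped in a small reusable function. This is just a sketch of my own making (Clear-HashTableValues is not a built-in cmdlet, just a name I chose):

function Clear-HashTableValues {
    param([hashtable]$HashTable)
    #enumerate a clone of the hashtable so we can safely
    # modify the values in the original
    foreach($key in $HashTable.Clone().Keys){
        $HashTable[$key] = '';
    }
}

#usage
$hash = @{FirstName="Bob";LastName="Smith";Age="23"}
Clear-HashTableValues $hash
$hash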





Monday, November 11, 2013

Migration Automation Toolkit (MAT) - Best video I have seen in a while

This would have to be the best video I have seen in a while. It speaks for itself.

View Video

Obviously you need a NetApp storage device for this toolkit to work, but it is built on PowerShell, so what I see is an opportunity to take the scripts and modify them for your environment. Even if you don't have a NetApp device, there would be some use in the scripts to help you get started in automating your migrations.

Read about it at http://migrate.azurewebsites.net/

Office Online feature enhancements

Some new features in Office Online make #getitdone much easier regardless of where you and your colleagues are, but most of all regardless of what device you are using.

http://blogs.office.com/b/office365tech/archive/2013/11/06/collaboration-just-got-easier-real-time-co-authoring-now-available-in-microsoft-office-web-apps.aspx


Tuesday, November 5, 2013

RML now supports SQL 2012 (and SQL 2014)

If you use the Replay Markup Language (RML) tool set then, like me, you will have been frustrated by the fact that it hadn't been updated to work with trace files captured from SQL 2012. You could probably assume that this was because the SQL Profiler Trace feature for the purpose of trace capture (database engine only) is flagged for deprecation in a future release (see http://technet.microsoft.com/en-us/library/ms181091.aspx).

Thankfully last week a new version was released which fully supports SQL 2012 and even SQL 2014 CTP2 trace file definitions. I have just gone through the process of uninstalling my RML tools and installing the new version, and straight away I can process trace files captured on a SQL 2012 instance. Hurray!

Be warned though: you cannot use the new version of the Reporter tool to access databases of previously processed trace data. I typically use this tool for performance analysis, so I might keep the analysis databases around for a few weeks, which just means I have to reprocess the trace files to generate reports compatible with the new version. Still worth it to get the benefit of the new supportability.
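
If it has been a while since you last reprocessed a trace, a typical ReadTrace invocation looks something like the sketch below. The install path, trace file, server name and analysis database name are all placeholders, and I am quoting the switches from memory, so verify them with ReadTrace /? or the KB article below before relying on them:

#reprocess a captured trace file into a new analysis database
# -I = input trace file, -o = output folder for reports/logs,
# -S = SQL instance to hold the analysis database, -d = analysis database name
& "C:\Program Files\Microsoft Corporation\RMLUtils\ReadTrace.exe" `
    -I"D:\Traces\SQL2012_Capture.trc" `
    -o"D:\Traces\Output" `
    -S"MyAnalysisServer" `
    -d"PerfAnalysis"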


Description of the Replay Markup Language (RML) Utilities for SQL Server
http://support.microsoft.com/kb/944837

If you find that you cannot click on the navigational hyperlinks on the Main Performance Overview report (e.g. "Unique Batches", "Unique Statements", "Interesting Events", etc.) then you will need to install the hotfix for ReportViewer.

Update fixes several Report Viewer issues after you install Visual Studio 2010 Service Pack 1
http://support.microsoft.com/kb/2549864
Or go directly to the download location at https://connect.microsoft.com/VisualStudio/Downloads/DownloadDetails.aspx?DownloadID=37409


Parsing SQL ERRORLOG's for errors with PowerShell

I've been finding the need to parse/grep the SQL Server ERRORLOG of late to look for any errors which have occurred, and for that I have been turning to my trusted friend PowerShell.

The cmdlet that gives us all the power here is Select-String

I will admit the hardest part about using the Select-String cmdlet for me was learning RegEx, but once you have the basics it is very powerful, and this approach could be used for any number of applications.

So at the very basic level, the following command can be used to report the lines with errors from the latest SQL ERRORLOG:

Select-String -Path .\ERRORLOG -Pattern "(error:)\s\d{1,6}"
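
For reference (and since I mentioned the RegEx learning curve), the pattern breaks down as follows:

# "(error:)\s\d{1,6}"
#   (error:)  - the literal text "error:", captured as a group
#   \s        - a single whitespace character
#   \d{1,6}   - between one and six digits (the error number)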

You could even use a wildcard in the path parameter to look for historical files (see the example below), but for my needs I was only interested in the current file.
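
For example, a wildcard would pick up the archived logs as well (ERRORLOG.1, ERRORLOG.2, and so on):

Select-String -Path .\ERRORLOG* -Pattern "(error:)\s\d{1,6}"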

Now that is all good and well, but SQL Server doesn't record the description of the error on the same line as the error number, so I needed a way to pull the next line from the file as well. For this I used the PowerShell pipeline to loop through each of the matched objects returned by Select-String using the Foreach-Object cmdlet. Within the scriptblock of that cmdlet I first output the line retrieved by Select-String, and then use Get-Content to pull in the contents of the file and retrieve only the line number I am interested in. Remember that Get-Content returns a zero-based array, while the LineNumber property from Select-String is one-based, so indexing the array with the unmodified LineNumber conveniently returns the very next line of the file.

Select-String -Path .\ERRORLOG.1 -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$((Get-Content "$($_.filename)")[$_.linenumber])"}

The performance of this one-line approach is not the greatest because it has to run the Get-Content cmdlet for each matched line, so if you are looking for speed this two-line approach has you covered. The only downside is that if you change the file name you are looking at then you have to update it twice, whereas the one-line approach passes that value along the pipeline.

$filedata = (Get-Content .\ERRORLOG)
Select-String -Path .\ERRORLOG -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$($filedata[$_.linenumber])"}


Now if you were processing multiple files, then you could do something like this multi-line or scripted version:

$filelist = Get-ChildItem .\ERRORLOG*;
Foreach ($file in $filelist)
{
    $filedata = (Get-Content $file.Name);
    Select-String -Path $file.Name -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$($filedata[$_.linenumber])"}
}

Or if you want to execute that from the PowerShell console, then the two-line approach would be:
$filelist = Get-ChildItem .\ERRORLOG*;
Foreach ($file in $filelist){$filedata = (Get-Content $file.Name);Select-String -Path $file.Name -Pattern "(error:)\s\d{1,6}" | Foreach-object {Write-Host "$($_.filename):$($_.linenumber):$($_.line)"; Write-Host "$($_.filename):$($_.linenumber+1):$($filedata[$_.linenumber])"}}


Or if you are looking to summarize the number of errors occurring across all of the ERRORLOG files, then the following provides an insight into that:

Select-String -Path ERRORLOG* -Pattern "(error:)\s\d{1,6}" | Group-Object -Property {$_.Line.Substring(0,10)},{$_.Matches.Value} | Sort-Object Name
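
The Group-Object output already includes a Count column, so if you want a tidier summary you can add a couple of formatting steps to the same pipeline, something like this:

Select-String -Path ERRORLOG* -Pattern "(error:)\s\d{1,6}" |
    Group-Object -Property {$_.Line.Substring(0,10)},{$_.Matches.Value} |
    Sort-Object Name |
    Select-Object Count, Name |
    Format-Table -AutoSize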


Happy investigations :)

If you are looking for a RegEx reference, here is the one I use:
http://www.regular-expressions.info/


Legal Stuff: As always the contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.