Thursday, February 25, 2016

Generating Certificates for Desired State Configuration lab/test environments

A colleague pointed out to me today that it has been a long time since I blogged, and you know what, he is right: I have mostly been posting tweets about interesting news. Today I was taking some colleagues through setting up a Desired State Configuration lab environment and ran into an issue with the certificates we were trying to use. To make it worse, it is an issue I have faced in the past but had forgotten about, so it seems like a great topic to blog about.

A quick search in your favorite search engine will show that this is not an uncommon problem, but few answer it plainly and simply.

Do not use New-SelfSignedCertificate to generate a certificate for testing DSC deployments (encrypting credentials) on Windows Server 2012 R2, or you will most likely receive the error:

    The private key could not be acquired.
        + CategoryInfo          : NotSpecified: (root/Microsoft/...gurationManager:String) [], CimException
        + FullyQualifiedErrorId : MI RESULT 1
        + PSComputerName        : localhost


Solution #1: Download and use the script/function from Script Center - https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6
 

Solution #2: Generate the certificate with MakeCert.exe from the Visual Studio Command Prompt.

For example, to make a certificate called DSCDemo.

    makecert.exe -r -pe -n "CN=DSCDemo" -sky exchange -ss my -sr localMachine
Further info on makecert can be found at https://msdn.microsoft.com/library/bfsktky3%28v=vs.100%29.aspx
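Once generated, it is worth confirming the private key really is accessible before you export the certificate. A quick check, as a sketch (the subject matches the DSCDemo example above):

    # List certificates in the LocalMachine\My store and check for private key data
    Get-ChildItem Cert:\LocalMachine\My |
        Where-Object { $_.Subject -eq 'CN=DSCDemo' } |
        Select-Object Thumbprint, Subject, HasPrivateKey,
            @{Name='PrivateKeyLoaded'; Expression={ $null -ne $_.PrivateKey }}

If HasPrivateKey is True but PrivateKeyLoaded is False, you are looking at exactly the symptom described below.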

Some background: I was trying to set up a DSC Pull server with a configuration for installing a SQL Server instance using xSqlPs. This required a credential for the 'sa' account to be stored in the configuration, which clearly needed to be encrypted, as I didn't want plain-text passwords in the DSC configuration. We tried all sorts of methods for exporting the certificate from the pull server, but eventually the clue was that even on the pull server itself the certificate reported no private key data in its properties.


    PS C:\Users\mattl> get-item Cert:\LocalMachine\my\{thumbprint} | fl *
    ...

    Extensions               : {System.Security.Cryptography.Oid, System.Security.Cryptography.Oid,
                               System.Security.Cryptography.Oid, System.Security.Cryptography.Oid}
    FriendlyName             : DSCDemo
    IssuerName               : System.Security.Cryptography.X509Certificates.X500DistinguishedName
    NotAfter                 : 2/25/2017 4:02:31 AM
    NotBefore                : 2/25/2016 3:42:31 AM
    HasPrivateKey            : True
    PrivateKey               :
    PublicKey                : System.Security.Cryptography.X509Certificates.PublicKey
    ...

    Issuer                   : CN=DSCDemo
    Subject                  : CN=DSCDemo

Again, the clue here is that the private key isn't displayed even on the server where the certificate was generated, so we know there is nothing wrong with the export/import; the problem is actually with the way the certificate was generated.
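For reference, the usual pattern for encrypting credentials in a configuration is to reference the certificate in the configuration data. A minimal sketch, where the node name, certificate path, and thumbprint are placeholders for your environment:

    # Illustrative configuration data; replace the path and thumbprint with
    # the values from your own exported certificate.
    $ConfigData = @{
        AllNodes = @(
            @{
                NodeName        = 'TargetNode01'
                CertificateFile = 'C:\Certs\DSCDemo.cer'          # public key used to encrypt
                Thumbprint      = 'REPLACE-WITH-CERT-THUMBPRINT'  # used by the node to decrypt
            }
        )
    }

    Configuration SqlDemo
    {
        param([PSCredential]$SaCredential)

        Node $AllNodes.NodeName
        {
            # Resources that accept $SaCredential will have it encrypted
            # with the certificate rather than stored in plain text in the MOF.
        }
    }

The point of this post is that the certificate referenced here must carry an exportable, accessible private key, which is why the generation method matters.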

A couple of searches later I found the blog post I used the last time I faced this issue (yes, that is annoying). It turns out this is a problem with the New-SelfSignedCertificate cmdlet; when you generate the certificate with MakeCert.exe as above, the private key data is visible both on the server where it is generated and on the server where it is imported (as long as you export the private key data too).

Hope this helps a few others, or at least helps me remember next time I face this problem before I waste a few hours trying to figure it out again ;)
 

BTW, in other news, did you see that WMF 5.0 (aka PowerShell 5.0) RTM has been re-released? Happy days.

Legal Stuff: As always the contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.

Thursday, October 29, 2015

Latest SSMS streamlines the process of adding Azure firewall rules at connection

If you haven't heard, Microsoft has broken the release of SQL Server Management Studio away from the main product. While at this stage it continues to be released at the same cadence as the SQL Server 2016 CTP cycle (i.e. monthly), the updates target features for existing releases, Azure SQL, and of course SQL Server 2016. You can get the latest release at http://aka.ms/ssms.

I have been working on a project lately with an Azure SQL Database back end, and one thing that has always frustrated me, given my role keeps me out in the field, is that I am constantly having to log into the portal and update the firewall rules. Well, the latest release of SSMS makes this process much easier.

Now when you try to connect to an Azure SQL server and a firewall rule on the server blocks your connection, you are no longer just shown the message informing you that the client is not allowed to connect. Instead you are prompted with a dialog asking you to log into Azure and create a firewall rule. It even populates the client IP address for you and suggests an IP range.


Once you authenticate, select the appropriate option for the firewall rule (e.g. single IP or IP range), click OK, and SSMS will call the Azure web services to add the firewall rule.
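If you would rather script the rule than click through the dialog, the classic Azure PowerShell module of this era can add it as well. A sketch, with the server name, rule name, and IP addresses purely illustrative:

    # Requires the (classic) Azure PowerShell module and an authenticated session
    Add-AzureAccount

    # Placeholders: substitute your server name, a rule name, and your client IP
    New-AzureSqlDatabaseServerFirewallRule -ServerName 'myazuresqlserver' `
        -RuleName 'ClientIP-FieldLaptop' `
        -StartIpAddress '203.0.113.10' -EndIpAddress '203.0.113.10'

Handy if you need to pre-provision rules for a team rather than one connection at a time.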


Once the rule is added, the authentication process continues which is a nice touch because I was half expecting to be prompted to log in again.

 
Obviously, for this to work the user needs the required permissions on the subscription.

This is a great example of how the team is actively working on improving the toolsets and responding to feedback so make sure you keep the connect items rolling in.


In other news, I am presenting with a colleague at MS Ignite Australia next month on some of the new toolset features in SQL Server 2016. If you're attending, come say hi, as I will be around the Data Den and Exhibition hall throughout the event.

https://msftignite.com.au/sessions/session-details/1524



Tuesday, June 30, 2015

SCOM: SQL Server Management Pack 6.6.0.0 released

The SQL Server Management Pack has been refreshed with all-new dashboards, which look to be fully extensible and designed for larger server farms. Full details on how to configure the dashboards are in the SQLServerDashboards.docx guide provided at the download links.

The new dashboards do look sexy, though, judging by these screenshots taken from the guides.





Version 6.6.0.0 can be downloaded at

SQL 2005-2012: https://www.microsoft.com/en-us/download/details.aspx?id=10631
SQL 2014: http://www.microsoft.com/en-us/download/details.aspx?id=42573

Again check out the specific guides provided on implementing these dashboards.



Thursday, June 18, 2015

Announcing the release of my SQL Server DSC Resource Module mlSqlPs on PowerShellGallery and Github

I've been automating the installation of SQL Server components for 10+ years now so it's no surprise I am a fan of PowerShell and Desired State Configuration (DSC). Given my background in large, high volume, and diverse environments, the main objective I always think about is how the process can be easily replicated time and time again but also easily adapted to fit the needs of diverse configurations.

Unfortunately, to date I have found the xSqlPs DSC module falls just short of these needs, but that was also the intent of the DSC Resource Kit waves: to provide the "starting blocks" for organizations to customize and extend the resources. I had already started talking with the team about how to improve the SQL resources, and now that the DSC Resource Kit modules have been made open source on GitHub, the door is open for the community to contribute to the standard set of resources. All changes I mention below are pending pull requests back into the main branch of the xSqlPs resource on GitHub.

The PowerShell Team is also working on their NuGet gallery at PowerShellGallery.com, which is going to greatly assist with the new PowerShellGet feature in PowerShell 5.0. The gallery is still in preview, but I am starting to contribute to it and this module is available there (link below).



To provide these features to everyone straight away, I have also launched my own version of the SQL DSC module called mlSqlPs. This allows me to get changes out quickly, while also working to merge the best of my work back into the main branch.

Version 1.0.0.0 of mlSqlPs DSC Module provides the following resources and enhancements.
  • xSqlServerInstall - Enhanced
    Installs SQL Enterprise on target machine.
    This resource has been enhanced to provide capabilities for aligning to SQL best practices (e.g. Data, Log, TempDb paths)
  • xSqlHAService
    Enables SQL high availability (HA) service on a given SQL instance.
  • xSqlHAEndpoint
    Configures the given instance of the SQL high availability service to listen on port 5022 with the given name, and assigns the users that are allowed to communicate through the SQL endpoint.
  • xSqlHAGroup
    Configures an SQL HA group. If the HA group does not exist it will create one with the given name on the given SQL instance and add the HA group database(s) to the local SQL instance.
  • xWaitForSqlHAGroup
    Waits for an SQL HA group to be ready by checking the state of the HA group of a given name at a given interval, until either the HA group is discoverable or the number of retries reaches its maximum.
  • xSqlAlias - New
    Configures Client Aliases in both native and wow6432node paths. Supports both tcp and named pipe protocols.
The module contains a help file in both HTML and MD format, as well as some Sample files.
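As a rough illustration of how the resources are consumed, here is a sketch of a configuration using the xSqlAlias resource. The property names are indicative only, so check the module's help file and Sample files for the exact schema:

    # Illustrative only - verify property names against the module's samples
    Configuration SqlClientAliases
    {
        Import-DscResource -ModuleName mlSqlPs

        Node 'localhost'
        {
            xSqlAlias MyAppAlias
            {
                Name       = 'MyAppSql'                  # alias the client will use
                ServerName = 'SQLSERVER01\INSTANCE01'    # target instance
                Protocol   = 'TCP'
                TcpPort    = 1433
            }
        }
    }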



You can get the module at the following locations:

PowerShell Gallery
https://www.powershellgallery.com/packages/mlSqlPs/

GitHub - want to contribute? Let me know
https://github.com/matticusau/mlSqlPs


Are you running Windows PowerShell 5.0? You can also get the module with the new Install-Module cmdlet.
    #Search for the module on the default gallery
    Find-Module -Name mlSqlPs -Repository PSGallery;

    #Install the module from the gallery
    Install-Module -Name mlSqlPs -Repository PSGallery;

    #Get the list of resources
    Get-DscResource -Module mlSqlPs;



And there you go: create your configuration, push or pull the MOF to the node, and watch the magic.


Keep checking the GitHub repository and PowerShell Gallery for updates on this module.



Friday, June 5, 2015

SCOM: Fixing a blank report list in Ops Mgr Console

Something I came across in SCOM 2012 R2 is that when users try to view Reports from the OpsMgr console wunderbar, they may receive a blank list.


Possible Cause #1 - Incorrect SSRS file system permissions
One thing that can impact this is incorrect file system permissions on the SSRS installation location.

1. On the server where the SSRS instance is installed, navigate to the SSRS install folder. Depending on your environment this could be the default below or a custom drive:

C:\Program Files\Microsoft SQL Server\MSRS<nn>.<InstanceName>\Reporting Services

Where <nn> is the numerical version of SQL Server (e.g. 11) and <InstanceName> is the instance name, such as MSRS11.MSSQLSERVER\Reporting Services
 

In my lab I customised this to the root of a partition like:
D:\MSRS11.MSSQLSERVER\Reporting Services




2. Open the properties of the ReportManager and ReportServer folders separately.

3. Ensure that a group containing the Windows accounts of the users trying to access the reports has Read & Execute permissions on the relevant folders.



This is from my lab where the permissions are the default; you may choose to secure this back to a group (or groups) containing just the users granted rights to your reports, but you need to ensure these permissions exist.
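To review those ACLs from PowerShell rather than the folder properties dialog, something like the following sketch can help (the root path is from my lab; adjust it to your install location):

    # Dump the access entries on the SSRS web folders
    $ssrsRoot = 'D:\MSRS11.MSSQLSERVER\Reporting Services'

    foreach ($folder in 'ReportManager', 'ReportServer') {
        Write-Host "ACLs for $folder"
        Get-Acl (Join-Path $ssrsRoot $folder) |
            Select-Object -ExpandProperty Access |
            Format-Table IdentityReference, FileSystemRights, AccessControlType -AutoSize
    }

You are looking for an entry granting your report users (or a group containing them) at least ReadAndExecute.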



Possible Cause #2 - Corrupted identities
If you face this issue, you may find that some users, or even just the SCOM administrator, can access the reports while others cannot. This can occur following a restore of the "Reporting" component, or even a restore/migration/upgrade of the whole SCOM environment.

The key to understanding the cause is to look at how both SCOM and SSRS configure security.

First, access the Administration section from the OpsMgr console wunderbar and view the User Roles. Open the properties of the Report Operator user role which contains the users who receive a blank list, then click the Identity tab.


Take note of the ID which is displayed and copy it to the clipboard; you will need it ;)
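If you prefer the command line, the same ID can be retrieved with the OperationsManager PowerShell module; a sketch, to be run where the console/module is installed:

    # List the SCOM user roles with their IDs; note the ID of your
    # Report Operator role for the SSRS role assignment below
    Import-Module OperationsManager
    Get-SCOMUserRole | Select-Object Name, Id | Format-Table -AutoSize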

Now navigate to your Report Server url (e.g. http://localhost/Reports).
Click on the Folder Settings button from the home screen.


Here you will be shown the security of the reports. Check whether the ID from earlier is shown; chances are your list will contain a different set of IDs.


IMPORTANT: Do not remove any role assignments

Click the "New Role Assignment" button.

Enter the ID into "Group or user name" and select the Browser, My Reports, and Report Builder roles.


Click OK

The security assignments should now contain the ID of your User Role from OpsMgr.


Now if you have your users refresh the Reports view in the Ops Mgr console, they should have access to the reports. At least that was the solution I found in my lab.


If you find other causes let me know :)


Reference articles/blogs:
Here are some other articles I have since found which provide troubleshooting guidance on this topic as well for your reference.

http://thoughtsonopsmgr.blogspot.com.au/2011/09/empty-reports-in-scom-whats-happening.html
http://thoughtsonopsmgr.blogspot.ch/2010/08/how-to-scope-scom-r2-reports-for.html









Friday, May 29, 2015

SQL Server 2016: Query Store first looks

In case you missed the announcement, the SQL Server 2016 CTP2 preview was publicly announced today. You can read about it in the following post:

http://blogs.technet.com/b/dataplatforminsider/archive/2015/05/27/sql-server-2016-first-public-preview-now-available.aspx



There are already some training materials up on https://technet.microsoft.com/en-us/virtuallabs

The first new feature I noticed was a new type of actual query plan to include: Live Query Statistics.


When enabled, the actual query plan is displayed as the query executes, and it also shows the real-time progress as each stage of the query runs. You can see this in the following screenshot where the Sort is at 65% (and subsequently all tasks after it are also at 65%). I wasn't fast enough to capture it before it reached the Sort, but it really does visually show you real-time query execution. I'm going to have to find a really large workload to try it out on ;)




Query Store is one of the new features I have been "geeking out" on since earlier in the year, and I am so glad I can now talk about it publicly. Here is my first look at the feature first hand.

First off, when you view the properties of a database there is a new "Query Store" configuration section, as seen in the following screenshot.


Before you can access this feature you need to enable it on each database. Like all data collection there is a small resource overhead, but it's minimal due to the layer of integration, so just enable it. And given the product is in preview, if you do see a performance impact, file a Connect item and let the product team know.


Now that you have enabled it, you can customise the data frequency, buffer hardening (flush), data retention/grooming, and so on. Here are the defaults.
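The same settings can also be applied with T-SQL rather than the properties dialog. A sketch via Invoke-Sqlcmd, with the instance, database name, and values purely illustrative (and remember this is CTP2, so the syntax may change before RTM):

    # Enable Query Store and adjust a couple of the collection settings
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query @"
    ALTER DATABASE [AdventureWorks2014] SET QUERY_STORE = ON;
    ALTER DATABASE [AdventureWorks2014] SET QUERY_STORE (
        OPERATION_MODE = READ_WRITE,
        DATA_FLUSH_INTERVAL_SECONDS = 900,  -- how often data is hardened to disk
        INTERVAL_LENGTH_MINUTES = 5         -- aggregation interval for runtime stats
    );
    "@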

In my lab I have adjusted these values to be more granular for the purpose of this blog, but you will need to find the sweet spot for your environment ;)

In the earlier screen shot of the database properties you can also see that there are graphical representations of the Query Store storage usage and a button to flush the storage if you need to.

Now if you refresh the database in Object Explorer you will see a new Query Store branch which contains some useful views.


These views offer some interesting data that previously we would have had to use extra tools to obtain.

The following shows the Overall Resource Consumption view.


Each of the views can be configured to adjust the metrics, date range, etc.



The most interesting of the views, in my opinion, is Top Resource Consuming Queries, as it shows the top X queries in the top-left box and the query plans which have been used to return the results in the top-right box.


When you select a "plan id" it displays the cached plan for that id in the lower pane.



Why is this so interesting? Well, as you can see in this screenshot, I have had two plans for the same query: one with parallelism and one without. From the bubble chart I can see which one has executed with the highest resource consumption and which with the least (based on my chart's configuration). From this data you can then choose whether to "force" a plan, or, if you have forced a plan previously, to "unforce" it.


This removes all that effort of trying to figure out whether a query plan has changed and is causing a degradation in performance, and then having to extract that plan and create a plan guide for it; the same goes for many other changes in performance behavior patterns.

Behind the scenes a great deal of data is being captured about query execution, which we can access from a number of system catalog views.

For example, in sys.query_store_query you can see data around the internals of query timings, such as compilation time, parse time, etc.
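As a sketch of digging into that data directly, the following queries the catalog views for the queries with the highest average compile time (instance and database names are illustrative):

    # Top queries by average compile duration, joined to their SQL text
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'AdventureWorks2014' -Query @"
    SELECT TOP 10
        q.query_id,
        q.avg_compile_duration,
        q.avg_optimize_duration,
        t.query_sql_text
    FROM sys.query_store_query AS q
    JOIN sys.query_store_query_text AS t
        ON q.query_text_id = t.query_text_id
    ORDER BY q.avg_compile_duration DESC;
    "@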



I am definitely looking forward to exploring this feature further, and to all the time I am going to get back when looking into performance issues.

Obviously this is CTP2 and features/functionality will change before RTM.


Thursday, May 7, 2015

SQL Server 2016 announcement

SQL Server 2016 has been publicly announced, with the public release candidate coming in the American summer. I will use this post to collate blog posts and announcements regarding SQL Server 2016, in particular the features I am looking forward to.

SQL Server 2016 public preview coming this summer
http://blogs.technet.com/b/dataplatforminsider/archive/2015/05/04/sql-server-2016-public-preview-coming-this-summer.aspx


Query Store
http://slavasql.blogspot.com.au/2014/11/newest-sql-server-feature-query-store.html

