tag:blogger.com,1999:blog-29311509503980143042024-03-06T06:47:27.639+10:00MatticusAU BlogMS SQL Server, C#, Powershell, SCOM and other technical interests....Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.comBlogger84125tag:blogger.com,1999:blog-2931150950398014304.post-32271893850710302082022-05-12T08:59:00.001+10:002022-05-12T08:59:05.988+10:00Validate the row counts of PBI tables in Data Flows<p>Recently I had to validate the row counts of some existing tables in PBI Data Flows that I was converting over to a new source. So wanted to make sure the row counts matched between the current and new source systems.</p><p>A simple way to do this is to add a blank query and use the Table.FromRecords and Table.RowCount functions:</p><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;">let</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> Source = Table.FromRecords({</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> [Table = "Table 1 Original", RowCount = Table.RowCount(#"Table 1 Original")],</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> [Table = "Table 1 New", RowCount = Table.RowCount(#"Table 1 New")],</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> [Table = "Table 2 Original", RowCount = Table.RowCount(#"Table 2 Original")],</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> [Table = "Table 2 New", RowCount = Table.RowCount(#"Table 2 
New")]</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> }) </span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;">in</span></div><div style="box-sizing: border-box; font-family: Consolas, "Courier New", monospace; font-size: 12px;"><span style="box-sizing: border-box;"> Source</span></div><p><br /></p>Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-50972156608461015702022-03-04T15:35:00.001+10:002022-03-04T15:35:05.920+10:00Azure Synapse Notebook variable & parameter language context<p>While designing a data platform using Azure Synapse (or Azure Data Factory) it is quite common to want to use variables and/or parameters to control various aspects of your code execution. One common example of this is parameterizing your pipelines for environment configuration control, that is to distinguish between Dev, Test, Prod.</p><p>There are many articles already about how to parameterize an Azure Notebook, so I won't rehash that in detail. Here is my favourite for reference:</p><p></p><ul style="text-align: left;"><li><a href="https://docs.microsoft.com/en-US/Azure/synapse-analytics/synapse-notebook-activity?tabs=classical#passing-parameters">https://docs.microsoft.com/en-US/Azure/synapse-analytics/synapse-notebook-activity?tabs=classical#passing-parameters</a></li><li><a href="https://microsoft-bitools.blogspot.com/2021/11/synapse-pipeline-pass-parameter-to.html">https://microsoft-bitools.blogspot.com/2021/11/synapse-pipeline-pass-parameter-to.html</a></li></ul><p></p><p><br /></p><p>What isn't well documented is <b>the Notebook default language will be the language context that the parameter is set in</b>. 
Regardless of if you use a language magic command in your Parameterized Cell.</p><p>It makes sense if you stop and think about it. It can catch you out though if your Notebooks require multiple languages to complete their tasks. For example, let's assume this scenario.</p><p><b>Scenario</b>: To interact with ADLS and other sources you use PySpark, but for creating data views etc you prefer Spark SQL due to language familiarity (TSQL), code formating, and maybe code migration. Due to the bulk of your code cells being Spark SQL that is the language you have set the Notebook to, therefore your PySpark cells use the %%pyspark magic command.</p><p>My pipeline has a parameter</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEiLZoxlGYFX-vDPRyNSecvfNNFK_0lcswfuS2ZuKnsGOUdYoUUHEfzHL6C0AtJidh1RailG6It7i8QXKJsesn_fJtX4ukPCj39fB4qkfi9Vs70jInoz_z1LQMNe_cKln5D2uvoVpVISJLCpVofGNx2VRk67AvuyrsYMVruupl1hyzeqlTcnbuRQvR0LAw=s726" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="209" data-original-width="726" height="92" src="https://blogger.googleusercontent.com/img/a/AVvXsEiLZoxlGYFX-vDPRyNSecvfNNFK_0lcswfuS2ZuKnsGOUdYoUUHEfzHL6C0AtJidh1RailG6It7i8QXKJsesn_fJtX4ukPCj39fB4qkfi9Vs70jInoz_z1LQMNe_cKln5D2uvoVpVISJLCpVofGNx2VRk67AvuyrsYMVruupl1hyzeqlTcnbuRQvR0LAw=s320" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhFBT-mlhtyxM_1fkJ9ObhtQjJt-zU8X7HvL_7S7igtFw_s07hyIfsByVqs9bPk6FDL8Kr2lerDFaEqtbxxVOqclpp1bJEprSCPKNJpVv6WrQ9NDeI3W63ROfUhBbgJTPFFf4KQGS-I5KO9htDunOCamIdMsCRTU_6laD93mhktCMc1JzXoJVB5VTcmTA=s1035" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="464" data-original-width="1035" height="143" 
src="https://blogger.googleusercontent.com/img/a/AVvXsEhFBT-mlhtyxM_1fkJ9ObhtQjJt-zU8X7HvL_7S7igtFw_s07hyIfsByVqs9bPk6FDL8Kr2lerDFaEqtbxxVOqclpp1bJEprSCPKNJpVv6WrQ9NDeI3W63ROfUhBbgJTPFFf4KQGS-I5KO9htDunOCamIdMsCRTU_6laD93mhktCMc1JzXoJVB5VTcmTA=s320" width="320" /></a></div><br /><p>The Notebook activity is passing that parameter through via dynamic values</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEgQrRo7-WsCvmZd6GMNvkAUrlrAq-1oVnleKrKjsNT8wtBRtwzfqqdZczs9V4kWTdFwSSKHSc2J8oui6sXN4Dtk_MQBKXxKs2mFlrUOo1NorK8zj4TXfwlPnvTfzMU5NsC7_rDdMLJR1k4fu9QwHJjZ17tqed9EIxPwegT2aRHSVQmS0xlNqTcs8JtqlA=s1035" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="464" data-original-width="1035" height="143" src="https://blogger.googleusercontent.com/img/a/AVvXsEgQrRo7-WsCvmZd6GMNvkAUrlrAq-1oVnleKrKjsNT8wtBRtwzfqqdZczs9V4kWTdFwSSKHSc2J8oui6sXN4Dtk_MQBKXxKs2mFlrUOo1NorK8zj4TXfwlPnvTfzMU5NsC7_rDdMLJR1k4fu9QwHJjZ17tqed9EIxPwegT2aRHSVQmS0xlNqTcs8JtqlA=s320" width="320" /></a></div><br /><p>The notebook is set to Spark SQL as the default language</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEg67t44fyWKflaDPRm56cpo2-Kk2rGUEn-p3HUobcQFsvuf7oroh9I-1NzkQzWsWrL1df2HPBrYRYKh05C428q-EcuB2v8jCZnm4bWnSjJrkWQH6rdTBDVy5MHr1r31VZ4QxsIcu6NNrKKH03vGwW5zavpMsFYh2pm2jtSbDydkMOfbEbSMVIG8ShoYCg=s325" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="86" data-original-width="325" height="85" src="https://blogger.googleusercontent.com/img/a/AVvXsEg67t44fyWKflaDPRm56cpo2-Kk2rGUEn-p3HUobcQFsvuf7oroh9I-1NzkQzWsWrL1df2HPBrYRYKh05C428q-EcuB2v8jCZnm4bWnSjJrkWQH6rdTBDVy5MHr1r31VZ4QxsIcu6NNrKKH03vGwW5zavpMsFYh2pm2jtSbDydkMOfbEbSMVIG8ShoYCg=s320" width="320" /></a></div><br /><p>The following screenshots have used the pipeline execution and notebook snapshot to trace 
the output/state of each step and highlight what is happening:</p><p>We can see the Parameterized Cell defined in PySpark code and then the Runtime Parameter being passed in and set in Spark SQL</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhqTYusIB6JERsgn4uYx5YhAsTgJb8dX-TbT4P7aBRykTYMLtGVIQH5v4n2kqOxPnRlyzWV9MIkPSBRyxWPayf_RkGt0LXiOTlnQis6Jz4hx7TYcrTZTB_sey-wjLXYKsk8THcmUTEgrt7utAiergOL6dQFxdz1tvxOu-xC49Ka6Xv8cIf_d04xHa-zTw=s770" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="390" data-original-width="770" height="162" src="https://blogger.googleusercontent.com/img/a/AVvXsEhqTYusIB6JERsgn4uYx5YhAsTgJb8dX-TbT4P7aBRykTYMLtGVIQH5v4n2kqOxPnRlyzWV9MIkPSBRyxWPayf_RkGt0LXiOTlnQis6Jz4hx7TYcrTZTB_sey-wjLXYKsk8THcmUTEgrt7utAiergOL6dQFxdz1tvxOu-xC49Ka6Xv8cIf_d04xHa-zTw=s320" width="320" /></a></div>Using some simple output commands in those respective languages we trace the current values.<div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEiMPz85PmrwiT2qUVjfgK0ZWGIwOxVsSX80q8KQP3zjedIZmgG5XouozfRfzbQSpc4rFX0Hf7O9uDQXTw33d41Qm_BY89ZWM0KJ3Dwm8zNQnXzbgr1UDlgXRsdqOZ7XY2bJUeyA4wdpFxPRmQbL5EYVqn5Muqcqj9fdIfqH3mb8bVt2B8ydUk1hlCGVjw=s484" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="434" data-original-width="484" height="287" src="https://blogger.googleusercontent.com/img/a/AVvXsEiMPz85PmrwiT2qUVjfgK0ZWGIwOxVsSX80q8KQP3zjedIZmgG5XouozfRfzbQSpc4rFX0Hf7O9uDQXTw33d41Qm_BY89ZWM0KJ3Dwm8zNQnXzbgr1UDlgXRsdqOZ7XY2bJUeyA4wdpFxPRmQbL5EYVqn5Muqcqj9fdIfqH3mb8bVt2B8ydUk1hlCGVjw=s320" width="320" /></a></div><br /><div><br /></div><div>This shows that the PySpark variable defined in our Parameterized Cell was not replaced and actually carries the value of 'dev'. 
The parameter passed into the Notebook was set in Spark SQL and the output of that language correctly has the value 'prod'.</div><div><br /><p>That's not necessarily a problem, except if we need to reference the Parameter value from PySpark (or another language). We can recast the value from Spark SQL to PySpark with some simple code.</p><div style="background-color: #fffffe; font-family: Consolas, "Courier New", monospace; font-size: 13px; line-height: 17px; white-space: pre;"><div>%%pyspark</div><div>param_df = spark.sql(<span style="color: #a31515;">'SELECT ${env_selection}'</span>)</div><div>env_selection = param_df.columns[<span style="color: #098658;">0</span>]</div></div><p>Now we can see the Print function in our PySpark code return the correct value from the Parameter.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEiR0r3XQrZ6RpkrdwdUeKNyu8QJq87vpMjqiJ4Jv3lznVOwUfWTaHg4oOFGCiXS6yrqq_bd73WUdtvcmaQ0iPVvLDZnj1GzOZYI6l0d253msNIaqrwsx61CBISEGF6xmLaL_qeIIfQ0K9nEbYmkKkhFwTUofiw5tLGyxVXvZLjkAsOflrtYxynwy3ZVPQ=s503" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="290" data-original-width="503" height="184" src="https://blogger.googleusercontent.com/img/a/AVvXsEiR0r3XQrZ6RpkrdwdUeKNyu8QJq87vpMjqiJ4Jv3lznVOwUfWTaHg4oOFGCiXS6yrqq_bd73WUdtvcmaQ0iPVvLDZnj1GzOZYI6l0d253msNIaqrwsx61CBISEGF6xmLaL_qeIIfQ0K9nEbYmkKkhFwTUofiw5tLGyxVXvZLjkAsOflrtYxynwy3ZVPQ=s320" width="320" /></a></div><br /><p><br /></p><p>This behavior also applies to any variables you define within a specific language via the magic command.</p><p><br /></p><p>Obviously, the right outcome here is to really think about what your Notebook's default language will be set to based on the requirements and behaviors. 
As I stated above though there are times you may need to use multiple languages and pass variable/parameter values between languages to achieve some functionality.</p><p><br /></p></div>Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-43414301933768858112018-11-15T11:27:00.000+10:002018-11-30T09:09:02.884+10:00Power BI Resources<div dir="ltr" style="text-align: left;" trbidi="on">
The following is a list of resources useful for Power BI training and product news:<br />
<br />
<h2 style="text-align: left;">
Product News</h2>
<br />
Power BI Blog<br />
<a href="https://powerbi.microsoft.com/en-us/blog/" target="_blank">https://powerbi.microsoft.com/en-us/blog/</a><br />
<div>
<br /></div>
<div>
<br /></div>
<h2 style="text-align: left;">
Syntax and Language Reference</h2>
<div>
<div>
Dax Reference</div>
<div>
<a href="https://aka.ms/dax" target="_blank">https://aka.ms/dax</a> </div>
</div>
<div>
<br />
Visuals Official Help<br /><a href="https://docs.microsoft.com/en-us/power-bi/visuals/power-bi-visualization-types-for-reports-and-q-and-a" target="_blank">https://docs.microsoft.com/en-us/power-bi/visuals/power-bi-visualization-types-for-reports-and-q-and-a</a><br />
<br />
Visuals Reference<br />
<a href="https://www.sqlbi.com/ref/power-bi-visuals-reference/" target="_blank">https://www.sqlbi.com/ref/power-bi-visuals-reference/</a></div>
<div>
<br /></div>
<h2 style="text-align: left;">
Social Media Resources</h2>
<div>
<div>
Follow these resources on Twitter:</div>
<div>
Power BI <a href="https://twitter.com/MSPowerBI" target="_blank">@MSPowerBI</a> </div>
<div>
Will Thompson <a href="https://twitter.com/Will_MI77" target="_blank">@Will_MI77</a></div>
</div>
<div>
<br /></div>
<h2 style="text-align: left;">
Training and Other Resources</h2>
<div>
<div>
Youtube channels (for training and learning):</div>
<div>
Power BI <a href="https://www.youtube.com/user/mspowerbi" target="_blank">https://www.youtube.com/user/mspowerbi</a> </div>
<div>
Guy in a Cube <a href="https://www.youtube.com/channel/UCFp1vaKzpfvoGai0vE5VJ0w" target="_blank">https://www.youtube.com/channel/UCFp1vaKzpfvoGai0vE5VJ0w</a> </div>
</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog are provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility for the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-11691624579724038642018-07-07T00:14:00.003+10:002018-07-07T00:14:40.108+10:00PowerBI calling a REST Api for each row in a table.<div dir="ltr" style="text-align: left;" trbidi="on">
Take this scenario.<br />
You're writing a PowerBI report to pull data from a REST API. The API doesn't provide a search route which returns all the rows and data you require in one call. You can retrieve a list of IDs for the object type you require, but a separate call is required to get the individual details per row. Or you may have an existing table with the IDs stored, and require additional details which are provided by an API call per row.<br />
<br />
I was asked about this scenario by a colleague, and it can definitely be solved with PowerBI. I will urge caution though: if you have a large recordset in the base table and plan to make an API call per row, the refresh will be slow and will generate a large number of calls to the API in a short duration, which could be flagged as an attack by some monitoring systems.<br />
<br />
<b><u>Demo API</u></b><br />
<br />
For this blog I have provided a demo NodeJS API that you can download and run in VSCode, but you could also use any public API, such as my SQL Versions API (details on that will follow at a later time).<br />
<br />
To run the API, follow the instructions in <a href="https://github.com/Matticusau/SQLDemos/blob/master/PowerBI_ApiDataSource/README.md" target="_blank">README.md</a>.<br />
<br />
<br />
<b><u>Demo Data</u></b><br />
<br />
The demo API has some static sample data. A collection of Course Instructors, which has a mapping to the Course collection and also a mapping to a Person collection. The complete sample data was taken from an old .Net app sample (<a href="https://msdn.microsoft.com/library/bb399731(v=vs.100).aspx" target="_blank">https://msdn.microsoft.com/library/bb399731(v=vs.100).aspx</a>).<br />
<br />
Here is a snippet of the structure and data:<br />
<br />
CourseInstructor = [<br />
{<br />
"CourseID": "1045",<br />
"PersonID": "5"<br />
},<br />
...<br />
];<br />
Course = [<br />
{<br />
"CourseID": "1045",<br />
"Title": "Calculus",<br />
"Credits": "4",<br />
"DepartmentID": "7"<br />
},<br />
...<br />
];<br />
Person = [<br />
{<br />
"PersonID": "1",<br />
"LastName": "Abercrombie",<br />
"FirstName": "Kim",<br />
"HireDate": "1995-03-11 00:00:00.000",<br />
"EnrollmentDate": null<br />
},<br />
...<br />
];<br />
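As an aside, the join these collections imply can be sketched outside PowerBI. Here is a rough Python illustration (illustrative only, not part of the demo API) using just the records shown in the snippet above; note the Person lookup comes back empty because PersonID 5 falls outside the truncated sample:<br />
<br />

```python
# Sample records taken from the snippet above (the full demo API has more rows)
course_instructor = [{"CourseID": "1045", "PersonID": "5"}]
course = [{"CourseID": "1045", "Title": "Calculus", "Credits": "4", "DepartmentID": "7"}]
person = [{"PersonID": "1", "LastName": "Abercrombie", "FirstName": "Kim",
           "HireDate": "1995-03-11 00:00:00.000", "EnrollmentDate": None}]

# Index the lookup collections by their ID so each base row needs one lookup
course_by_id = {c["CourseID"]: c for c in course}
person_by_id = {p["PersonID"]: p for p in person}

# Build the "completed" rows the way the report will: per-row lookups
completed = [
    {
        "CourseID": ci["CourseID"],
        "PersonID": ci["PersonID"],
        "Title": course_by_id.get(ci["CourseID"], {}).get("Title"),
        "LastName": person_by_id.get(ci["PersonID"], {}).get("LastName"),
    }
    for ci in course_instructor
]
print(completed[0]["Title"])  # the Course lookup resolves to "Calculus"
```

<br />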
<br />
<br />
<br />
<b><u>Let's get started</u></b><br />
<br />
<b><i>1. Get the base source</i></b><br />
For this walkthrough let's start with the CourseInstructor data. This is a list of CourseIDs and PersonIDs, with no other details. We will then use the individual API calls to build the completed table.<br />
In the Demo API the API route is http://localhost:3000/api/courseinstructor.<br />
<br />
In PowerBI select <b>Get Data</b><br />
<br />
In the form type "web" to narrow the results and locate the Web data source.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAC0cC4IE-AjSjep4KjcSXb8cMwPuHuBddpwCkoS5I2BAdOTD0FX2aFnC6v6Y5Awa8jR_Ue8dLA3h92pbJs9-dyZmBx1vnLifAGNo-gaoKvqbeRIddmLBEbZq4dKtRCggDOgrDE6Ltake3/s1600/01.GetData.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="230" data-original-width="575" height="128" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAC0cC4IE-AjSjep4KjcSXb8cMwPuHuBddpwCkoS5I2BAdOTD0FX2aFnC6v6Y5Awa8jR_Ue8dLA3h92pbJs9-dyZmBx1vnLifAGNo-gaoKvqbeRIddmLBEbZq4dKtRCggDOgrDE6Ltake3/s320/01.GetData.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
In the next screen enter the API route (e.g. http://localhost:3000/api/courseinstructor). For security this should ideally be an HTTPS URI.</div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisRTy8Sxaj27xGohQzlb1ULbRszz07awrWpZNCk6e34DGkU50zhifk2o1lfc4twpxV6EMDLtM0B5H9NW4w1zgmunEnt82jlBVtrUP0SBVQ43cCdHB86u9O3JRxZ2wl49YySWvGGC4kct_9/s1600/02.FromWeb.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="459" data-original-width="711" height="206" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisRTy8Sxaj27xGohQzlb1ULbRszz07awrWpZNCk6e34DGkU50zhifk2o1lfc4twpxV6EMDLtM0B5H9NW4w1zgmunEnt82jlBVtrUP0SBVQ43cCdHB86u9O3JRxZ2wl49YySWvGGC4kct_9/s320/02.FromWeb.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: left;">
If your API needs Authorization, or a Token, then you should set the appropriate headers. I have an example in one completed solution at the end.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<b><i>2. Convert to table</i></b></div>
<div class="separator" style="clear: both; text-align: left;">
The results will first be shown as a list, as PowerBI uses the Json.Document() function to parse the retrieved data.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
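To give a feel for what these two steps do, here is a rough Python analogy (illustrative only, this is not Power Query M): Json.Document() parses the response into a list of records, and "To Table" then lays those records out as rows and columns. The second record below is made up for illustration.<br />
<br />

```python
import json

# A response body like the demo API's /api/courseinstructor route would return
# (the second record is a made-up example row)
body = '[{"CourseID": "1045", "PersonID": "5"}, {"CourseID": "1050", "PersonID": "1"}]'

# Json.Document() in M plays the role json.loads() plays here: the result is a
# list of records, not yet a table
records = json.loads(body)

# "To Table" then turns the list of records into rows and columns
columns = sorted(records[0].keys())
rows = [[r[c] for c in columns] for r in records]
print(columns)   # ['CourseID', 'PersonID']
print(rows[0])   # ['1045', '5']
```

<br />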
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizg1lQjFxehNCEeNu_aj-QQdpJUudyD2sJrCh9WVprK3eQOAQl8oXyQDQHiud63LN_mh_8oJUR8huymlhjIX3jRha4I45XGYX2TJgeM0fKTK4z2EJju91ZQBv8WT54W9rvyderqNraWuQG/s1600/03.ResultsList.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="278" data-original-width="659" height="134" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizg1lQjFxehNCEeNu_aj-QQdpJUudyD2sJrCh9WVprK3eQOAQl8oXyQDQHiud63LN_mh_8oJUR8huymlhjIX3jRha4I45XGYX2TJgeM0fKTK4z2EJju91ZQBv8WT54W9rvyderqNraWuQG/s320/03.ResultsList.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Click the "To Table" button in the menu ribbon.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEigyHy_mfSvGaeUptzDXFN_hX6bpxRhCbvIbPYuGGzn1g944xlG3mZoL1SZOfKNjEJmdme28kpD2QLN_Pw9kpAaVRWdoVC1oB3vBgs7tPQ6QFl_lhfyXumZXvMlWHEwC5wDnJbvkPnQk8R9/s1600/04.ToTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="143" data-original-width="473" height="96" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEigyHy_mfSvGaeUptzDXFN_hX6bpxRhCbvIbPYuGGzn1g944xlG3mZoL1SZOfKNjEJmdme28kpD2QLN_Pw9kpAaVRWdoVC1oB3vBgs7tPQ6QFl_lhfyXumZXvMlWHEwC5wDnJbvkPnQk8R9/s320/04.ToTable.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
In the To Table dialog you should be able to leave the defaults. However, if the API's JSON data isn't consistently structured, that is, each document has different elements and sizes, then you may need to experiment with the error handling options.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2acUEPg_XoLwdoEkwQ5paLQrL9m8Cm-B1_-mO3UymhE5STkNEMEeo7FM5O-F3xM4zcKcr5JZM6UlbAMCwYpcirXbfRYFZgaU5s1wVcpY5kreSLI32-OneQKrPAuPUvSta95tSz3idfBO7/s1600/05.ToTableDialog.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="274" data-original-width="709" height="123" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2acUEPg_XoLwdoEkwQ5paLQrL9m8Cm-B1_-mO3UymhE5STkNEMEeo7FM5O-F3xM4zcKcr5JZM6UlbAMCwYpcirXbfRYFZgaU5s1wVcpY5kreSLI32-OneQKrPAuPUvSta95tSz3idfBO7/s320/05.ToTableDialog.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both;">
<b><i>3. Expand the columns</i></b></div>
<div class="separator" style="clear: both;">
The results will now be a typical nested table on which you need to expand the columns. Click on the expand columns button.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_hBvDlmqy53c1Ayu5WUJNra2tN8-3E9VQVafPCM6lUDiqHD4WR3fUbH4LlzXBG7NbespLub8lQQqEcWs6UGo4Ow9x5kgZiZqkH45L-cYk1bc7dM8X7KG2e6DB0s6JuoQcLfu8F_FZC6sD/s1600/06.ExpandColumnsButton.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="224" data-original-width="155" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_hBvDlmqy53c1Ayu5WUJNra2tN8-3E9VQVafPCM6lUDiqHD4WR3fUbH4LlzXBG7NbespLub8lQQqEcWs6UGo4Ow9x5kgZiZqkH45L-cYk1bc7dM8X7KG2e6DB0s6JuoQcLfu8F_FZC6sD/s1600/06.ExpandColumnsButton.png" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Select the required columns. This will depend on your API data; in the demo data's case we want both columns, as we will add additional API routes to get that data. </div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
My tip is to untick the "Use original column name as prefix" option. If you do want to use it, make sure you have renamed the column to a meaningful name prior to this step. You can always rename it by editing the step settings afterwards, but do so before you proceed with other steps.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuCpwjApF1qiG2dvAb5L2zy142_PaM5Gm4N59R1D9JNUq6hHjX1LBZS-0Yo_Y5L-MFsAIIv3WmYTsOduGqiYeqy4Hs-NM5hR6N3p0FAeYwFbxxmO1uDtaetlKvW_m-4qpi3K88FPhWHAQt/s1600/07.ExpendColumnDialog.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="290" data-original-width="499" height="185" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuCpwjApF1qiG2dvAb5L2zy142_PaM5Gm4N59R1D9JNUq6hHjX1LBZS-0Yo_Y5L-MFsAIIv3WmYTsOduGqiYeqy4Hs-NM5hR6N3p0FAeYwFbxxmO1uDtaetlKvW_m-4qpi3K88FPhWHAQt/s320/07.ExpendColumnDialog.png" width="320" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
The results should now be expanded as required.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFAakmM9gTDTmbonq2-bEOfwFKdAO1ZeCOWcMCm2TNZg6QcuV-G6fa0PGLwd4N_jI2jmwubnXr2zcSTFP5UEHtkE5wXUyR7crbJYKq1HiKaoi0LpLK-nYKIV87PHoKmS8SnAW4JSIMEz2C/s1600/08.ResultsExpanded.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="225" data-original-width="272" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgFAakmM9gTDTmbonq2-bEOfwFKdAO1ZeCOWcMCm2TNZg6QcuV-G6fa0PGLwd4N_jI2jmwubnXr2zcSTFP5UEHtkE5wXUyR7crbJYKq1HiKaoi0LpLK-nYKIV87PHoKmS8SnAW4JSIMEz2C/s1600/08.ResultsExpanded.png" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<b><i>4. Expand the data with additional API calls per row</i></b></div>
<div class="separator" style="clear: both;">
Now for the fun part. We have rows of data which contain our IDs. To expand this data we need to make an additional API call for each row. To achieve this we add a custom column and dynamically build the API URI.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Click the <b>Add Column</b> tab in the ribbon, and click <b>Custom Column</b>.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Give the column a meaningful name (e.g. CourseCollection).</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Here is the tricky part: we have to dynamically build the API URI. To do this we can use the Text.Combine() function to concatenate the URI and add the column value where the parameter should be. For our demo data the completed URI will be http://localhost:3000/api/course/[CourseID]</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Adjust the following expression as required:</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
= Json.Document(</div>
<div class="separator" style="clear: both;">
Web.Contents(</div>
<div class="separator" style="clear: both;">
Text.Combine({"http://localhost:3000/api/course/",[CourseID]})</div>
<div class="separator" style="clear: both;">
)</div>
<div class="separator" style="clear: both;">
)</div>
<div class="separator" style="clear: both;">
<br /></div>
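The per-row pattern itself is language agnostic. As a sketch of what the custom column does, here it is in Python with the fetch call injected so the example is self-contained (fake_fetch and its canned response are placeholders, not the demo API):<br />
<br />

```python
def expand_rows(rows, base_uri, fetch):
    """For each row, build the per-row URI (as Text.Combine does in M) and
    attach the fetched record, mirroring the custom column step."""
    expanded = []
    for row in rows:
        uri = base_uri + row["CourseID"]   # e.g. .../api/course/1045
        detail = fetch(uri)                # Json.Document(Web.Contents(uri)) in M
        expanded.append({**row, "CourseCollection": detail})
    return expanded

# A stub standing in for the demo API, so the sketch runs without a server
def fake_fetch(uri):
    return {"CourseID": uri.rsplit("/", 1)[-1], "Title": "Calculus"}

result = expand_rows([{"CourseID": "1045", "PersonID": "5"}],
                     "http://localhost:3000/api/course/", fake_fetch)
print(result[0]["CourseCollection"]["Title"])  # Calculus
```

<br />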
<div class="separator" style="clear: both;">
If you need to provide headers for authorization then use the parameters of the Web.Contents() function, for example:</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
= Json.Document(</div>
<div class="separator" style="clear: both;">
Web.Contents(</div>
<div class="separator" style="clear: both;">
Text.Combine({"http://localhost:3000/api/course/",[CourseID]}), [Headers=[#"Z-API-Token"=""]]))</div>
<div class="separator" style="clear: both;">
<br /></div>
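For comparison, the Python equivalent of attaching a header per request looks like this (urllib from the standard library; Z-API-Token is just the placeholder header name used above, and no request is actually sent here):<br />
<br />

```python
import urllib.request

# Build a request carrying the token header, the way Web.Contents' Headers
# record does; urllib.request.urlopen(req) would then perform the call
req = urllib.request.Request(
    "http://localhost:3000/api/course/1045",
    headers={"Z-API-Token": "my-secret-token"},
)
# urllib normalises header capitalisation, but the value is carried as given
assert "my-secret-token" in req.headers.values()
```

<br />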
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgYP_1iMQMnWu_kANtksSjTU-gnRDClqd2LlUl00yJ_dmCSDrFNdOhl4L69s_kGfEPGwDLX-EGiHHlGDqEERaxxHIbyW4CFegw88CrtlUWr_S9AgKZR5alrAtFxnbfMqUzE41Yevd_0UANg/s1600/10.CustomColumnDialog.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="426" data-original-width="711" height="191" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgYP_1iMQMnWu_kANtksSjTU-gnRDClqd2LlUl00yJ_dmCSDrFNdOhl4L69s_kGfEPGwDLX-EGiHHlGDqEERaxxHIbyW4CFegw88CrtlUWr_S9AgKZR5alrAtFxnbfMqUzE41Yevd_0UANg/s320/10.CustomColumnDialog.png" width="320" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
NOTE: If prompted regarding the <b>data privacy</b>, click <b>continue </b>and set as appropriate.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
IMPORTANT: The process of updating the table may take some time depending on the number of rows and speed of the API.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
The results should now include the custom column (e.g. CourseCollection). </div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwaMaKC36LmDrmkpQ4c8ZdWQ_jw4dsKISkKKVrdm2JVkfuXT0Y8ZUS82VJEThqnjAmvwtumNdGDXVFPij39j2k58Z8tXVVzW5-P6rMO-rhoMYRMSMu0bWzkIfbMFc8yxvEanXAZMzUL1Jy/s1600/11.ResultsWithCustomColumn.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="222" data-original-width="420" height="169" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwaMaKC36LmDrmkpQ4c8ZdWQ_jw4dsKISkKKVrdm2JVkfuXT0Y8ZUS82VJEThqnjAmvwtumNdGDXVFPij39j2k58Z8tXVVzW5-P6rMO-rhoMYRMSMu0bWzkIfbMFc8yxvEanXAZMzUL1Jy/s320/11.ResultsWithCustomColumn.png" width="320" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
This column needs to be expanded, the same as earlier, by clicking on the expand columns button and then selecting the desired columns to include. In my demo data's case, I am only interested in the Title column.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqfKGhPtfi51SHdKktmecxScBPbsoQqcoxr3x6aAjFnUyJvYLjUqio6zwWivPyFjag5Idn5SRNIvxdFpdEMda9MFueoT5-yerMo3v5KoqDZGAaG3aQ_zj5KNnfbJnf2vr3FQC7q4ar_Dga/s1600/12.ExpandCustomColumn.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="288" data-original-width="434" height="212" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqfKGhPtfi51SHdKktmecxScBPbsoQqcoxr3x6aAjFnUyJvYLjUqio6zwWivPyFjag5Idn5SRNIvxdFpdEMda9MFueoT5-yerMo3v5KoqDZGAaG3aQ_zj5KNnfbJnf2vr3FQC7q4ar_Dga/s320/12.ExpandCustomColumn.png" width="320" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Your table should now include the data from the additional API call.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeTG4WeTxcPZQHmAvI4PmuMwR0WxoMifrXv9024_yifCDdSgnIc8KGMZj-TzuDZYV6ANWPxmmLVpML50sMHSagS6XFMZebFuXlwJmokrzsDI4CwZuh-ODkDUHXuLEYXV-6Lzx9fU_BG4Ml/s1600/13.ExpandCustomColumnResults.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="223" data-original-width="379" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeTG4WeTxcPZQHmAvI4PmuMwR0WxoMifrXv9024_yifCDdSgnIc8KGMZj-TzuDZYV6ANWPxmmLVpML50sMHSagS6XFMZebFuXlwJmokrzsDI4CwZuh-ODkDUHXuLEYXV-6Lzx9fU_BG4Ml/s320/13.ExpandCustomColumnResults.png" width="320" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
Continue to model the data as you require, or make additional API calls. In the case of the demo data we will repeat these steps for the Person column, using the API URI route http://localhost:3000/api/person/[PersonID]</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
The completed table will look like:</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGFA_i2OMqHmQppCVsEo-QWAUZOlKn31ZzT3szRyTvNfh1RD5QTcog-gNjJWsm8yLmp2BbgZdH2diOhI2LPs7Av-1cZMNsilIG3slvtRsVtNucqFQwwfeuJ3qZaFNFV5R08SwddqoPRdhN/s1600/14.CompletedTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="226" data-original-width="609" height="118" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGFA_i2OMqHmQppCVsEo-QWAUZOlKn31ZzT3szRyTvNfh1RD5QTcog-gNjJWsm8yLmp2BbgZdH2diOhI2LPs7Av-1cZMNsilIG3slvtRsVtNucqFQwwfeuJ3qZaFNFV5R08SwddqoPRdhN/s320/14.CompletedTable.png" width="320" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
The query steps will be similar to the following:</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU44Q6axY1u6Z5porF0efvWuWS50ZKtIJhPwyCqwhMW2P8TZ5h-ObUXJo31RJ3Uh81egwzbzP3me15kpDv97rtlbgUj-mzXx8H2eud4t4mzN27nLWhG9aLyaF9yp8FeXKhMSl_ygzlgT1h/s1600/15.QuerySettings.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="324" data-original-width="244" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU44Q6axY1u6Z5porF0efvWuWS50ZKtIJhPwyCqwhMW2P8TZ5h-ObUXJo31RJ3Uh81egwzbzP3me15kpDv97rtlbgUj-mzXx8H2eud4t4mzN27nLWhG9aLyaF9yp8FeXKhMSl_ygzlgT1h/s320/15.QuerySettings.png" width="240" /></a></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
The complete M query will be similar to the following. It is available from the Advanced Editor of a query, and is a working query for the demo API.</div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
let</div>
<div class="separator" style="clear: both;">
Source = Json.Document(Web.Contents("http://localhost:3000/api/courseinstructor")),</div>
<div class="separator" style="clear: both;">
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),</div>
<div class="separator" style="clear: both;">
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"CourseID", "PersonID"}, {"CourseID", "PersonID"}),</div>
<div class="separator" style="clear: both;">
#"Added CourseCollection" = Table.AddColumn(#"Expanded Column1", "CourseCollection", each Json.Document(</div>
<div class="separator" style="clear: both;">
Web.Contents(</div>
<div class="separator" style="clear: both;">
Text.Combine({"http://localhost:3000/api/course/",[CourseID]})</div>
<div class="separator" style="clear: both;">
)</div>
<div class="separator" style="clear: both;">
)),</div>
<div class="separator" style="clear: both;">
#"Expanded CourseCollection" = Table.ExpandRecordColumn(#"Added CourseCollection", "CourseCollection", {"Title"}, {"Title"}),</div>
<div class="separator" style="clear: both;">
#"Added PersonCollection" = Table.AddColumn(#"Expanded CourseCollection", "PersonCollection", each Json.Document(</div>
<div class="separator" style="clear: both;">
Web.Contents(</div>
<div class="separator" style="clear: both;">
Text.Combine({"http://localhost:3000/api/person/",[PersonID]})</div>
<div class="separator" style="clear: both;">
)</div>
<div class="separator" style="clear: both;">
)),</div>
<div class="separator" style="clear: both;">
#"Expanded PersonCollection" = Table.ExpandRecordColumn(#"Added PersonCollection", "PersonCollection", {"LastName", "FirstName"}, {"LastName", "FirstName"})</div>
<div class="separator" style="clear: both;">
in</div>
<div class="separator" style="clear: both;">
#"Expanded PersonCollection"</div>
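For readers more comfortable outside Power Query, the same per-row enrichment pattern can be sketched in Python. This is purely illustrative: the base URL and field names are taken from the demo API above, and the fetch function is injectable so the logic can be exercised without the API running.

```python
import json
from urllib.request import urlopen

BASE_URL = "http://localhost:3000/api"  # demo API base (assumed running locally)

def fetch_json(url):
    # Fetch a URL and parse the JSON body, like Json.Document(Web.Contents(...))
    with urlopen(url) as resp:
        return json.load(resp)

def enrich_course_instructors(rows, fetch=fetch_json):
    # rows: list of dicts with CourseID and PersonID, as returned by the
    # /courseinstructor endpoint after expanding the record column
    enriched = []
    for row in rows:
        course = fetch(f"{BASE_URL}/course/{row['CourseID']}")
        person = fetch(f"{BASE_URL}/person/{row['PersonID']}")
        enriched.append({**row,
                         "Title": course["Title"],
                         "LastName": person["LastName"],
                         "FirstName": person["FirstName"]})
    return enriched
```

As in the M query, each source row triggers one request per related entity, so for large tables consider caching repeated lookups.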
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<b><u>Demo Files and Source</u></b></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
I have provided a completed PBIX in the GitHub repository. </div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<a href="https://github.com/Matticusau/SQLDemos/tree/master/PowerBI_ApiDataSource" target="_blank">https://github.com/Matticusau/SQLDemos/tree/master/PowerBI_ApiDataSource</a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-62933552235898838562018-06-09T00:29:00.000+10:002018-06-09T00:29:22.393+10:00New SQL Version Life Cycle tool<div dir="ltr" style="text-align: left;" trbidi="on">
Introducing my latest SQL Server Life Cycle and Versions lookup tool.<br />
<h2 style="text-align: center;">
<a href="https://sqlversions.azurewebsites.net/">https://sqlversions.azurewebsites.net</a></h2>
Back in 2016 I released the first version of this app and the goals were the same: provide an interface to easily look up a SQL version number and find out the support status of that release, all backed by an API that can be queried from scripts and other tools.<br />
<br />
This new release is a complete re-write and includes a modern responsive site written in Angular (because I felt the need to learn).<br />
<br />
With this initial release I am providing the following capabilities in the UI:<br />
<br />
<b><u>Version Search</u></b><br />
<br />
The search feature allows you to look up a specific release of SQL Server or a version number and then see all the releases that match that search. The results show the release date and the mainstream and extended support dates.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrBsRvxHfPjM8t7eBOwE-_QAHf0oy-B-Er1I6SkO4IYbyMg-n_9jsCuNci6-6kSCXg7GapHAKf7HZb3iPuhIG3mTH6yOYlZQyR1rreuvwAC_85kIiTgDowf9RmDGuA34pdZiQ4GvValumz/s1600/sqlversionswebapp-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="560" data-original-width="922" height="242" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrBsRvxHfPjM8t7eBOwE-_QAHf0oy-B-Er1I6SkO4IYbyMg-n_9jsCuNci6-6kSCXg7GapHAKf7HZb3iPuhIG3mTH6yOYlZQyR1rreuvwAC_85kIiTgDowf9RmDGuA34pdZiQ4GvValumz/s400/sqlversionswebapp-03.png" width="400" /></a></div>
<br />
<br />
Clicking on a row in the results will open the details for that release.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgc_IvdPy0115iwQJ1N_Dbv0hHuW_ceYE3uE9TTVIWdvGdWRTPnYsRnAJZQ9C8AiTkw4dlViDXB8w4fxBLS04nMAeuN5l_LRdU278bliFKgGYGw-TpoIvUaXq7FGtVhQWFzFKVGGgarKYKi/s1600/sqlversionswebapp-04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="599" data-original-width="932" height="205" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgc_IvdPy0115iwQJ1N_Dbv0hHuW_ceYE3uE9TTVIWdvGdWRTPnYsRnAJZQ9C8AiTkw4dlViDXB8w4fxBLS04nMAeuN5l_LRdU278bliFKgGYGw-TpoIvUaXq7FGtVhQWFzFKVGGgarKYKi/s320/sqlversionswebapp-04.png" width="320" /></a></div>
<br />
<br />
<b><u>Life Cycle</u></b><br />
<br />
The life cycle search page is a quick way to look up when certain branches of a product release will end mainstream or extended support. I created this because, while consulting onsite, I sometimes need to quickly look up life cycle information rather than a specific version. It was also a "wish list feature" from a customer I was working with at the time.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzVnCMpegAAdbke-dsbnYt7ZDeqcqz-oGjOfu68roqU85O8NiYZPw3486IRFlMOqYuL2tCPlh1mOgHFTBnYCy3tlnSwacACT732gwcgAgVdWJ-tJBzMJ1xwyFFhKEJCOalMRz-hVFnai1e/s1600/sqlversionswebapp-05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="485" data-original-width="718" height="270" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzVnCMpegAAdbke-dsbnYt7ZDeqcqz-oGjOfu68roqU85O8NiYZPw3486IRFlMOqYuL2tCPlh1mOgHFTBnYCy3tlnSwacACT732gwcgAgVdWJ-tJBzMJ1xwyFFhKEJCOalMRz-hVFnai1e/s400/sqlversionswebapp-05.png" width="400" /></a></div>
<br />
<br />
<br />
<b><u>Health Check [beta]</u></b><br />
<br />
The health check is a beta release of my end goal, which is to allow you to enter your version number and get basic recommendations about how to upgrade. The initial release simply recommends either starting to plan an upgrade or upgrading urgently, depending on that version's support status. My vision for this feature is to provide guidance on updates released on the same branch to stay current, as well as life cycle plans for upgrades to service packs or the next release.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWTutSGyZEoYJVesY_sNcgnPZmYRi4mOatDEMZH0eMVFfsj-36L8P90N3FWgp-8T_fzRDz8Kv-RYMfTPiQn0QYMtijvnYwq0ALexo1NN6q1B3Yn2ZFT0do_jOTZR1jkVUkuRb5DFtx6K7Q/s1600/sqlversionswebapp-07.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="428" data-original-width="723" height="189" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWTutSGyZEoYJVesY_sNcgnPZmYRi4mOatDEMZH0eMVFfsj-36L8P90N3FWgp-8T_fzRDz8Kv-RYMfTPiQn0QYMtijvnYwq0ALexo1NN6q1B3Yn2ZFT0do_jOTZR1jkVUkuRb5DFtx6K7Q/s320/sqlversionswebapp-07.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9sBsGCVb0Bwugm2BirOvXqdCzvvvjJZQJcmpyvAbfMN04MVDAP8rnGSDQW4Y1YedHHx5Y003f5hdSNpxBFcPthM_jXvFvSqnVyRuV6DpU2cj4LVDyIrtwQ3-bjLPahonp-nLcmLfEQYHp/s1600/sqlversionswebapp-06.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="401" data-original-width="726" height="176" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9sBsGCVb0Bwugm2BirOvXqdCzvvvjJZQJcmpyvAbfMN04MVDAP8rnGSDQW4Y1YedHHx5Y003f5hdSNpxBFcPthM_jXvFvSqnVyRuV6DpU2cj4LVDyIrtwQ3-bjLPahonp-nLcmLfEQYHp/s320/sqlversionswebapp-06.png" width="320" /></a></div>
<br />
<br />
<b><u>API</u></b><br />
<br />
One of my main goals with this app was to ensure it had an API which could be queried and then used to extend other tools and scripts. For example, you could incorporate it into a health check script to retrieve the version number from the SQL instance and then call the API for life cycle and support information about that release. Or you could use Power BI, taking data from your CMDB and performing a lookup against the API for each row, or a search against the API stored as a query with a relationship link.<br />
<br />
The following code provides an example of using the API in a Health Check script. The code is available at <a href="https://gist.github.com/Matticusau/5778b90507cb7274deebc12cf4360c1c">https://gist.github.com/Matticusau/5778b90507cb7274deebc12cf4360c1c</a><br />
<br />
<pre class="brush:ps"># Import the SQL module
Import-Module SqlServer;
# Get the version number via an appropriate method
$VersionNumber = Invoke-Sqlcmd -ServerInstance $SqlServerName -Query "SELECT SERVERPROPERTY('PRODUCTVERSION')";
$VersionNumber = $VersionNumber.Column1;
# Call the API to get the version information
$VersionData = Invoke-RestMethod -Uri "http://sqlserverbuildsapi.azurewebsites.net/api/builds?version=$($VersionNumber)";
# Want to improve your health check script? Calculate the health of the support status
if ($VersionData.SupportEndExtended -le (Get-Date)) {$SupportStatus = 'Critical'}
elseif ($VersionData.SupportEndMainstream -le (Get-Date)) {$SupportStatus = 'Warning'}
else {$SupportStatus = 'Ok'}
# Format the output data string
$OutputData = @"
Instance = $($SqlServerName)
Version = $($VersionData.BuildVersion)
Product = $($VersionData.ProductName)
Branch = $($VersionData.BranchName)
Update = $($VersionData.BuildUpdate)
MainstreamSupportEnd = $($VersionData.SupportEndMainstream)
ExtendedSupportEnd = $($VersionData.SupportEndExtended)
SupportStatus = $($SupportStatus)
"@
# Output the results
$OutputData
</pre>
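The support-status branching in the script above is easy to port to other tooling. Here is a minimal Python sketch of the same classification logic (the date fields are assumed to mirror the support-end properties returned by the API):

```python
from datetime import date

def support_status(mainstream_end, extended_end, today=None):
    # Mirrors the PowerShell branching above: extended support already
    # ended is Critical; mainstream ended but extended remaining is
    # Warning; otherwise the release is fully supported.
    today = today or date.today()
    if extended_end <= today:
        return "Critical"
    if mainstream_end <= today:
        return "Warning"
    return "Ok"
```

Ordering matters here: the extended-support check must come first, since any release past extended support is also past mainstream support.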
<br />
Soon, I will update the samples repository I provided for the previous version to reflect the new API syntax.<br />
<br />
<br />
<br />
<br />
Lastly, a note on the data. There is no dynamic link from this data to any Microsoft or other site. The data is provided "as-is" and manually maintained by myself and a few trusted peers. We do our best to make sure it is up to date and accurate, but for any business-critical or commercial decision make sure you refer to the official sources.<br />
<br />
If you like this tool, or have some ideas for improvements, or even notice inaccuracies in the data please let me know.<br />
<br />
<br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-8584613398267387922018-05-27T23:44:00.001+10:002018-05-28T00:18:47.115+10:00Convert CSV files to Parquet using Azure HDInsight<div dir="ltr" style="text-align: left;" trbidi="on">
A recent project I worked on used CSV files as part of an ETL process from on-premises to Azure. To improve performance further down the stream we wanted to convert the files to Parquet format (with the intent that eventually they would be generated in that format). I couldn't find a current guide stepping through that process using Azure HDInsight, so this post provides one.<br />
<br />
The scripts and samples used in this guide are available at <a href="https://github.com/Matticusau/SQLDemos/tree/master/HDInsightConvertCSVtoParquet">https://github.com/Matticusau/SQLDemos/tree/master/HDInsightConvertCSVtoParquet</a><br />
<br />
To follow this blog post you will need to:<br />
<br />
<ol style="text-align: left;">
<li>Create a resource group in your Azure Subscription</li>
<li>Create a Storage Account within the resource group</li>
<li>Create an Azure HDInsight resource in the same resource group (you can use that storage account for HDInsight)</li>
<li>Upload the sample GZip compressed CSV files from the SampleData folder to the Storage Account using Azure Storage Explorer. In my case I uploaded to a container "DataLoad"</li>
</ol>
<div>
The work we will perform will be within a Jupyter Notebook. </div>
<div>
<br /></div>
<div>
From the Azure Portal locate the HDInsight resource and click the Cluster dashboard quick link</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjP0XoK-4q9xvvUDGoCA0nFV8EQy_SKcdruGRi5-Lbhk4X4cmx5AORcf_8jcQvZwE8MTMeH8LTN4EfiIeUKf0Ii6w3P4yZaP1vzig63sQJCzA5SUn1Bg7RpxL_uDgH9pMW0Njo9UZHCbnmM/s1600/HDInsightConvertCSVtoParquet_01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="169" data-original-width="809" height="66" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjP0XoK-4q9xvvUDGoCA0nFV8EQy_SKcdruGRi5-Lbhk4X4cmx5AORcf_8jcQvZwE8MTMeH8LTN4EfiIeUKf0Ii6w3P4yZaP1vzig63sQJCzA5SUn1Bg7RpxL_uDgH9pMW0Njo9UZHCbnmM/s320/HDInsightConvertCSVtoParquet_01.png" width="320" /></a></div>
<div>
<br /></div>
<div>
Now select the Jupyter Notebook</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUL5mc-8usb6U8UFXtIAlJrXNnSeXRtBwQLmYJun7QwekLW0wE6N4GIlwu-HZxngiNrc4QYSPwSnIKDTGlfj5ETqFk23UhAnsaCxj6diL6wgs6eagA9gWm-wk6XnGyzm_Min9NPckROids/s1600/HDInsightConvertCSVtoParquet_02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1010" data-original-width="385" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUL5mc-8usb6U8UFXtIAlJrXNnSeXRtBwQLmYJun7QwekLW0wE6N4GIlwu-HZxngiNrc4QYSPwSnIKDTGlfj5ETqFk23UhAnsaCxj6diL6wgs6eagA9gWm-wk6XnGyzm_Min9NPckROids/s320/HDInsightConvertCSVtoParquet_02.png" width="121" /></a></div>
<div>
<br /></div>
<div>
This will open a new tab/window.</div>
<div>
<br /></div>
<div>
Authenticate as the cluster administrator.</div>
<div>
<br /></div>
<div>
Create a new PySpark Notebook.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgq2qHSG4xVTj2ELLTOLGsbRw3bcJjtbO-K1nUO7oQKKQbTbtM07DbW5c-hDJoUcODP4kQhTNRsVC7MBRN_nBh5dKvzwnZXbdgYOi0sTzVruB6JaLH6BlS2QcG-dj-feX0GwgtlI7kNlvW8/s1600/HDInsightConvertCSVtoParquet_03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="306" data-original-width="221" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgq2qHSG4xVTj2ELLTOLGsbRw3bcJjtbO-K1nUO7oQKKQbTbtM07DbW5c-hDJoUcODP4kQhTNRsVC7MBRN_nBh5dKvzwnZXbdgYOi0sTzVruB6JaLH6BlS2QcG-dj-feX0GwgtlI7kNlvW8/s200/HDInsightConvertCSVtoParquet_03.png" width="144" /></a></div>
<div>
<br /></div>
<div>
Paste the following lines and press Shift+Enter to run the cell.</div>
<div>
<br /></div>
<br />
<div>
<i>from pyspark.sql import *</i></div>
<div>
<i>from pyspark.sql.types import *</i></div>
<br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOCdGURQdz4H4vUc41CAf9uPXu-RCOXbb1rFJbuYUGT4XRylNNo3I4gsvEz96DxqOQCj3e1Ye0pi_VdY-14bfq5kuX5G9GaCvm0Nb5h1TCccGwUKl3z_1n6KDNZ4zEP7cBYTLPdRF1ZySw/s1600/HDInsightConvertCSVtoParquet_04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="103" data-original-width="1318" height="31" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOCdGURQdz4H4vUc41CAf9uPXu-RCOXbb1rFJbuYUGT4XRylNNo3I4gsvEz96DxqOQCj3e1Ye0pi_VdY-14bfq5kuX5G9GaCvm0Nb5h1TCccGwUKl3z_1n6KDNZ4zEP7cBYTLPdRF1ZySw/s400/HDInsightConvertCSVtoParquet_04.png" width="400" /></a></div>
<div>
<br /></div>
<div>
Now we can import the CSV into a table. You will need to adjust the path to represent your storage account, container and file. The syntax of the storage path is <i>wasb://mycontainer@myaccount.blob.core.windows.net/foldername/filename</i></div>
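The path convention above can be captured in a small helper. This is illustrative only; the container, account, and file names below are the ones used in this demo.

```python
def wasb_path(container, account, blob):
    # Builds a WASB URI in the form:
    # wasb://<container>@<account>.blob.core.windows.net/<folder/filename>
    return f"wasb://{container}@{account}.blob.core.windows.net/{blob}"
```

For example, `wasb_path("dataload", "mlbigdatastoracc", "SalesSample_big.csv.gz")` produces the path used in the import step below.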
<br />
<div>
<div>
<br /></div>
<div>
<i># import the COMPRESSED data</i></div>
<div>
<i>csvFile = spark.read.csv('wasb://dataload@mlbigdatastoracc.blob.core.windows.net/SalesSample_big.csv.gz', header=True, inferSchema=True)</i></div>
<div>
<i>csvFile.write.saveAsTable("salessample_big")</i></div>
</div>
<br />
<div>
<br /></div>
<div>
Press Shift+Enter to run the cell</div>
<div>
<br /></div>
<div>
Once complete, you can use SQL to query the table you imported the data into. This will create a dataframe to hold the output, which we will use to write the Parquet file.</div>
<div>
<br /></div>
<div>
<div>
<i>dftable = spark.sql("SELECT * FROM salessample_big")</i></div>
<div>
<i>dftable.show()</i></div>
</div>
<div>
<br /></div>
<div>
The final step is to export the dataframe to a Parquet file. We will also use gzip compression.</div>
<div>
<br /></div>
<div>
<i>dftable.write.parquet('wasb://dataload@mlbigdatastoracc.blob.core.windows.net/SalesSample2.parquet',None, None , "gzip")</i></div>
<div>
<br /></div>
<div>
The complete Jupyter Notebook should look like:</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsajdqLXDvRhTIDskj7R7dq2SKytbT52Ph6gY28XQ8IXzY1VXFq3NgnfzsEvrITctQkMtDa9zCkeOR7x5Sw8p_fUFc33NwUgSNyyCVCf7q_OJwbt_S9soUqMGEFLUl3J_rngTuE8iuyi2Q/s1600/HDInsightConvertCSVtoParquet_05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="992" data-original-width="1131" height="280" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsajdqLXDvRhTIDskj7R7dq2SKytbT52Ph6gY28XQ8IXzY1VXFq3NgnfzsEvrITctQkMtDa9zCkeOR7x5Sw8p_fUFc33NwUgSNyyCVCf7q_OJwbt_S9soUqMGEFLUl3J_rngTuE8iuyi2Q/s320/HDInsightConvertCSVtoParquet_05.png" width="320" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
In your storage account you should now have a Parquet export of the data (note that this format is not a single file, as shown by the file, folder and child files in the following screenshots).</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEglevesREGowmif-Br9q8JW4MqMZGB8BP7Caf2N4mlTTPZ8FaaXdxqeMNH4mOFbv9_18I541Ky6I13oXh-nRBfdQINSKqA0qvMv2qODQviHA9iPcfmxByOSdWpoFZ9D_PspOz3m1CK2yKvl/s1600/HDInsightConvertCSVtoParquet_06.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="223" data-original-width="832" height="85" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEglevesREGowmif-Br9q8JW4MqMZGB8BP7Caf2N4mlTTPZ8FaaXdxqeMNH4mOFbv9_18I541Ky6I13oXh-nRBfdQINSKqA0qvMv2qODQviHA9iPcfmxByOSdWpoFZ9D_PspOz3m1CK2yKvl/s320/HDInsightConvertCSVtoParquet_06.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUP9NlKrDo_fw3enYUXeroSWCR1TT6TJONn2Of9GfZcIiMrrKlcA4N5eyvdTi7GM8ML35lk-scWTthYLkr5wHAkcWVHqV0OgUI2KDguCWXJVMb5ewnLZBYzhDbl1c8TLui4V5lLiuXGJhs/s1600/HDInsightConvertCSVtoParquet_07.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="406" data-original-width="1014" height="128" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUP9NlKrDo_fw3enYUXeroSWCR1TT6TJONn2Of9GfZcIiMrrKlcA4N5eyvdTi7GM8ML35lk-scWTthYLkr5wHAkcWVHqV0OgUI2KDguCWXJVMb5ewnLZBYzhDbl1c8TLui4V5lLiuXGJhs/s320/HDInsightConvertCSVtoParquet_07.png" width="320" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
In this example you may notice that the compressed file sizes are not much different, yet the Parquet file is slightly more efficient. Your experience may vary, as it depends on the content of the CSV file.</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
Some reference material worth checking out if this is something you are working on:</div>
<div>
<a href="https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-jupyter-spark-sql-use-portal">https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-jupyter-spark-sql-use-portal</a></div>
<div>
<a href="https://spark.apache.org/docs/latest/api/python/">https://spark.apache.org/docs/latest/api/python/</a></div>
<div>
<a href="https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame">https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame</a></div>
<div>
<br /></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
<div>
<br /></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-55554796023977256612018-04-21T11:14:00.000+10:002018-04-21T11:14:28.072+10:00New SqlOps Extensions<div dir="ltr" style="text-align: left;" trbidi="on">
Announcing my new <b>SqlOps Extensions</b><br />
<br />
<b>AlwaysOn Insights</b><br />
<a href="https://github.com/Matticusau/sqlops-alwayson-insights/releases">https://github.com/Matticusau/sqlops-alwayson-insights/releases</a><br />
<br />
<b>MSSQL Instance Insights</b><br />
<a href="https://github.com/Matticusau/sqlops-mssql-instance-insights/releases">https://github.com/Matticusau/sqlops-mssql-instance-insights/releases</a><br />
<br />
<b>MSSQL Db Insights</b><br />
<a href="https://github.com/Matticusau/sqlops-mssql-db-insights/releases">https://github.com/Matticusau/sqlops-mssql-db-insights/releases</a><br />
<br />
<br />
Previously I had released a single extension which was logically three separate extensions, but due to the methods available at the time it had to be released as one. With the added support for the Extension Marketplace in the SqlOps March release it made sense to break these out. Going forward this will also make it easier to manage the life cycle of each extension individually.<br />
<br />
If you want to know how I wrote these extensions check out the following posts by Kevin Cunnane<br />
<br />
Writing a SQL Operations Studio Extension in 15 minutes<br />
<a href="https://medium.com/@kevcunnane/writing-a-sql-operations-studio-extension-in-15-minutes-7dfd24a74dfe">https://medium.com/@kevcunnane/writing-a-sql-operations-studio-extension-in-15-minutes-7dfd24a74dfe</a><br />
<br />
Publishing an extension for SQL Operations Studio<br />
<a href="https://medium.com/@kevcunnane/publishing-an-extension-for-sql-operations-studio-f5a5b323c13b">https://medium.com/@kevcunnane/publishing-an-extension-for-sql-operations-studio-f5a5b323c13b</a><br />
<br />
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-31589523942104764412017-10-02T11:50:00.002+10:002017-10-02T11:50:38.210+10:00Troubleshooting SQL Server AlwaysOn AG Auto Seeding<div dir="ltr" style="text-align: left;" trbidi="on">
SQL Server 2016 introduced a new capability with AlwaysOn Availability Groups called Automatic Seeding. This allows you to automatically start streaming the backup using VDI across the log stream transport to the secondary replica. With this capability enabled you do not have to manually back up and restore the database and transaction logs before starting data synchronization with the primary replica.<br />
<br />
Now there are some prerequisites for using Automatic Seeding such as:<br />
<br />
<ul style="text-align: left;">
<li>Data and Log file paths must be the same on all Replicas</li>
<li>Databases must be in Full recovery model and at least one Full backup must have been taken of the database (to start the recovery chain)</li>
<li>All other prerequisites for a database to be added to an Availability Group must also be met</li>
</ul>
<div>
<br /></div>
<br />
You can enable Automatic Seeding either before or after creating the Availability Group.<br />
<br />
Sometimes you will find that Automatic Seeding doesn't work; in our example it will be because a data file already exists with the same name as the database we are seeding, which is a common problem if you have previously removed the replica from the AG and are trying to re-join it. Unfortunately the UI doesn't give you any indication that seeding failed.<br />
<br />
Enter Extended Events.<br />
<br />
To demonstrate this I will create an Availability Group across two replicas. I am using the SQL Server AlwaysOn solution available from the Azure Market Place as this has an Availability Group already built, but for this demonstration I will create a new AG and Demo databases.<br />
<br />
<b>1. Setup the environment</b><br />
I have added an additional disk to the SQLSERVER-0 node, formatted this new volume, and assigned the drive letter G:. I also created the path G:\Log and granted full control permissions to the SQL service account.<br />
<br />
<b>2. Create the Demo Database</b><br />
The following script can be used to setup the environment.<br />
<br />
[<a href="https://github.com/Matticusau/SQLDemos/blob/master/SQLAlwaysOnAGAutoSeeding/01.CreateDbs.sql" target="_blank">01.CreateDbs.sql</a>]<br />
<br />
<b>3. Create the Availability Group</b><br />
The following script creates the Availability Group.<br />
<br />
[<a href="https://github.com/Matticusau/SQLDemos/blob/master/SQLAlwaysOnAGAutoSeeding/02.CreateAG.sql" target="_blank">02.CreateAG.sql</a>]<br />
<br />
<br />
<b>4. Verify Environment</b><br />
Now you should have a working Availability Group with the AutoseedDb01 synchronized and healthy between two replicas.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIG9I3pmR428M62Hzi1cUthpiuYFz6zyq5na9edhiBRJGjqMk0pg2cBdwtaIZCVTPsuMP327CrVstCh5RouA0GjVAJQQ4k-riNtazrUiFBIiZh7cw59tuiGiGd4gwReaFemt5hJBHZMyeL/s1600/AGAutoseed-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="491" data-original-width="497" height="197" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIG9I3pmR428M62Hzi1cUthpiuYFz6zyq5na9edhiBRJGjqMk0pg2cBdwtaIZCVTPsuMP327CrVstCh5RouA0GjVAJQQ4k-riNtazrUiFBIiZh7cw59tuiGiGd4gwReaFemt5hJBHZMyeL/s200/AGAutoseed-01.png" width="200" /></a></div>
<br />
<br />
<b>5. Create AlwaysOn_AutoseedMonitor Extended Events session</b><br />
While the standard AlwaysOn_health Extended Events session is already enabled, run the following script to create a new Extended Events session for monitoring Automatic Seeding.<br />
<br />
[<a href="https://github.com/Matticusau/SQLDemos/blob/master/SQLAlwaysOnAGAutoSeeding/03.ExtendedEventSession.sql" target="_blank">03.ExtendedEventSession.sql</a>]<br />
<br />
If you want to create this session manually, or to explore what other events are available, make sure you select the Debug channel when using the Wizard to select events.<br />
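If you prefer to script the session, a minimal version capturing the two events used later in this post might look like the following (the session and file target names are assumptions; the demo script in my GitHub repo adds further dbseed events):<br />
<pre class="brush:sql">CREATE EVENT SESSION [AlwaysOn_AutoseedMonitor] ON SERVER
ADD EVENT sqlserver.error_reported,
ADD EVENT sqlserver.hadr_physical_seeding_progress
ADD TARGET package0.event_file (SET filename = N'AlwaysOn_AutoseedMonitor.xel')
WITH (STARTUP_STATE = ON);
GO
ALTER EVENT SESSION [AlwaysOn_AutoseedMonitor] ON SERVER STATE = START;
GO
</pre>
<br />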
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgZhj1H2fa2VZCsBghv6_tJ0sHq9cq__Ki5ELfiiFmtGxmr2nr72qf4LZQCGQrbx2rnoZ7CKbR7Ud4cXGtto7TYgLZLb_QaHCd0MdWwbzh9CrVxRACyDqQ4wdN8nfdpkx4iDjEVGrBoKnzP/s1600/AGAutoseed-02-Xel.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="289" data-original-width="285" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgZhj1H2fa2VZCsBghv6_tJ0sHq9cq__Ki5ELfiiFmtGxmr2nr72qf4LZQCGQrbx2rnoZ7CKbR7Ud4cXGtto7TYgLZLb_QaHCd0MdWwbzh9CrVxRACyDqQ4wdN8nfdpkx4iDjEVGrBoKnzP/s200/AGAutoseed-02-Xel.png" width="197" /></a></div>
<br />
<br />
You can also filter the category to "dbseed" to view only the events that relate to auto seeding.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSh5lqZxRrIllY0BuSbO0LAYaGnH6XXeiEvvV94A1lvk5yEus4SJpGL-BGeqDaculOJZA8RkThmtoN-qC-TsdtgHI-BM04or0nmdG0d1CEv9Ox4NxjpK8nATf5GZ8OkkNGCwSzHOHk1Pf4/s1600/AGAutoseed-03-Xel.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="295" data-original-width="370" height="159" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSh5lqZxRrIllY0BuSbO0LAYaGnH6XXeiEvvV94A1lvk5yEus4SJpGL-BGeqDaculOJZA8RkThmtoN-qC-TsdtgHI-BM04or0nmdG0d1CEv9Ox4NxjpK8nATf5GZ8OkkNGCwSzHOHk1Pf4/s200/AGAutoseed-03-Xel.png" width="200" /></a></div>
<br />
<br />
<b>6. Add the 2nd database to AG</b><br />
Now use the following script to add the 2nd database to the AG. NOTE: This DB has the log file on G:\ which does not exist on the replica.<br />
<br />
[<a href="https://github.com/Matticusau/SQLDemos/blob/master/SQLAlwaysOnAGAutoSeeding/04.AddDbToAG.sql" target="_blank">04.AddDbToAG.sql</a>]<br />
<br />
<b>7. Investigating the Auto Seeding</b><br />
You will notice that the DB was not created on the replica and it is listed in the AG's DBs but with a warning.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrTCz7wtQejbLi4CHQjLgQ5RziPCcAGPiLyMTa-C1SF1UIAhEe6nMQNPj3-VV1g8aPTs9FwObI-AHnXfb3kaarlP3r3iDqFUzXTUDskQYLjLqiyy34q1FR4DI8DWzS0mfvrNzyPjJq7S9H/s1600/AGAutoseed-04-AddDB.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="448" data-original-width="399" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrTCz7wtQejbLi4CHQjLgQ5RziPCcAGPiLyMTa-C1SF1UIAhEe6nMQNPj3-VV1g8aPTs9FwObI-AHnXfb3kaarlP3r3iDqFUzXTUDskQYLjLqiyy34q1FR4DI8DWzS0mfvrNzyPjJq7S9H/s200/AGAutoseed-04-AddDB.png" width="178" /></a></div>
<br />
<br />
If you open the AG Dashboard you will see the warning message as well.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgo2OnGLmAB0jD77ejpWtXQ0CeZliBh6OZpkxgB_azR8rWFeiiQEXKtVpZ3a9WE4qrY_gUvQ9800W76lYF0lRkDmpUbJZpVLdtdv-GOK-FhfqKs5nTl1tznJlId9M3Jt4Kpx9Du3Mrarbnu/s1600/AGAutoseed-05-AGDashboard.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="695" data-original-width="1230" height="179" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgo2OnGLmAB0jD77ejpWtXQ0CeZliBh6OZpkxgB_azR8rWFeiiQEXKtVpZ3a9WE4qrY_gUvQ9800W76lYF0lRkDmpUbJZpVLdtdv-GOK-FhfqKs5nTl1tznJlId9M3Jt4Kpx9Du3Mrarbnu/s320/AGAutoseed-05-AGDashboard.png" width="320" /></a></div>
<br />
<br />
The messages do not include much detail.<br />
<br />
<b>8. Using DMVs</b><br />
We can use the following DMVs to query the status of the seeding.<br />
<br />
[<a href="https://github.com/Matticusau/SQLDemos/blob/master/SQLAlwaysOnAGAutoSeeding/05.DMVs.sql" target="_blank">05.DMVs.sql</a>]<br />
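As a minimal sketch of what the linked script queries (column names are worth checking against your SQL Server build), the two seeding DMVs of interest are sys.dm_hadr_automatic_seeding for seeding attempts and their failure state, and sys.dm_hadr_physical_seeding_stats for in-flight progress:<br />
<pre class="brush:sql">-- Seeding attempts and their outcome (run on the primary)
SELECT ag.name AS ag_name, adc.database_name,
    s.start_time, s.completion_time,
    s.current_state, s.failure_state_desc
FROM sys.dm_hadr_automatic_seeding AS s
JOIN sys.availability_groups AS ag ON ag.group_id = s.ag_id
JOIN sys.availability_databases_cluster AS adc ON adc.group_database_id = s.ag_db_id;
GO
-- Progress of seeding operations currently in flight
SELECT * FROM sys.dm_hadr_physical_seeding_stats;
GO
</pre>
<br />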
<br />
<b>9. Using Extended Events</b><br />
We are capturing the "error_reported" event along with the <i>dbseed</i> events. View the Event File on the replica where the failure happened and locate the hadr_physical_seeding_progress events. There will be a lot of them. In the screenshot below you can see the progress event for one of the last stages of the seeding process, where it reports the failure.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEig68mRSMPg-GBnibzhMy5tCddL0jJ2AuRAGVc4oUR1iukI2z0YoPhdh26wcnGcRn6pD_BlNGEz1MOKSK16XBpRGiFfbyoIQqOY67KwXvBINhZjft_jJZLML3e-jZHPZFsRmVkuVIhjWtvO/s1600/AGAutoseed-06-XelSeedingFailure.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="572" data-original-width="729" height="251" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEig68mRSMPg-GBnibzhMy5tCddL0jJ2AuRAGVc4oUR1iukI2z0YoPhdh26wcnGcRn6pD_BlNGEz1MOKSK16XBpRGiFfbyoIQqOY67KwXvBINhZjft_jJZLML3e-jZHPZFsRmVkuVIhjWtvO/s320/AGAutoseed-06-XelSeedingFailure.png" width="320" /></a></div>
<br />
<br />
Locate the <b><i>hadr_physical_seeding_progress</i></b> event with the internal_state_desc of "<b><i>WaitingForBackupToStartSending</i></b>". This is an early step in the auto seed process; after it you should see <b><i>error_reported</i></b> events, and this is where you can find the real errors. Most likely it will be the first one reported. In our case it is:<br />
<i>Directory lookup for the file "G:\LOG\AutoseedDb02_log.ldf" failed with the operating system error 3(The system cannot find the path specified.).</i><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXdogxxhEmnwccCiN4vlI3aEe-d4Wq01xh-_CvdROMSbVHMcuo_4s0LeEskoWzAeYeFCRwMvJlQ2pdjzMWJ-L9SMdYeSuKym8Wa_4yOnuD_f0e73NiVHfUBOz2ytHrYsgfK9lBQAM4aRiP/s1600/AGAutoseed-07-XelSeedingFailureErrorMsg.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="523" data-original-width="656" height="255" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXdogxxhEmnwccCiN4vlI3aEe-d4Wq01xh-_CvdROMSbVHMcuo_4s0LeEskoWzAeYeFCRwMvJlQ2pdjzMWJ-L9SMdYeSuKym8Wa_4yOnuD_f0e73NiVHfUBOz2ytHrYsgfK9lBQAM4aRiP/s320/AGAutoseed-07-XelSeedingFailureErrorMsg.png" width="320" /></a></div>
<br />
<br />
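You can also pull the error_reported events straight out of the session's .xel file with T-SQL rather than scrolling the viewer; the file name pattern below assumes the session used an event_file target named as in the demo:<br />
<pre class="brush:sql">;WITH xe AS (
    -- Read the raw event payloads from the session's .xel file(s)
    SELECT CAST(event_data AS XML) AS x
    FROM sys.fn_xe_file_target_read_file(N'AlwaysOn_AutoseedMonitor*.xel', NULL, NULL, NULL)
)
SELECT x.value('(event/@timestamp)[1]', 'datetime2') AS event_time,
    x.value('(event/data[@name="message"]/value)[1]', 'varchar(4000)') AS error_message
FROM xe
WHERE x.value('(event/@name)[1]', 'varchar(100)') = 'error_reported'
ORDER BY event_time;
</pre>
<br />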
<br />
There are many more cases where this is useful. I have also used it when old data or log files from previous versions of a database remained on the server while adding the database back into the AG, as sometimes that operation fails (depending on permissions).<br />
<br />
Let me know what situations you find it useful for.<br />
<br />
<br />
All the scripts used in this post are located in my GitHub <a href="https://github.com/Matticusau/SQLDemos/tree/master/SQLAlwaysOnAGAutoSeeding" target="_blank">https://github.com/Matticusau/SQLDemos/tree/master/SQLAlwaysOnAGAutoSeeding</a><br />
<br />
<br />
<br />
References<br />
<br />
Automatic seeding for secondary replicas<br />
<a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/automatic-seeding-secondary-replicas" target="_blank">https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/automatic-seeding-secondary-replicas</a><br />
<br />
Automatically initialize Always On Availability group<br />
<a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/automatically-initialize-always-on-availability-group" target="_blank">https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/automatically-initialize-always-on-availability-group</a><br />
<br />
<br />
<br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em><br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;"><br /></em>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;"><br /></em></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-39483665430266427162017-09-27T12:34:00.001+10:002017-09-27T13:23:25.612+10:00Checklist for troubleshooting compilation of DSC Configuration in Azure Automation<div dir="ltr" style="text-align: left;" trbidi="on">
I was recently working on a solution where a DSC Configuration block was not compiling in Azure Automation. This solution included Node Data, DSC Resources, and custom Composite Resources. The configuration would compile perfectly fine locally but not when published to Azure Automation. The other challenging aspect was that other composite resources within the same module were compiling fine.<br />
<br />
Unfortunately Azure Automation doesn't provide very detailed information for troubleshooting DSC compilation errors. It will only show you the first-level failure, which means that when you're using a DSC Composite Resource you will simply receive an error that the composite resource failed to import, while the actual cause could be a problem with an underlying resource used by that composite resource.<br />
<br />
So based on my experience I have come up with the following troubleshooting checklist when working through DSC Compilation errors in Azure Automation.<br />
<br />
<strong><u>Troubleshooting Checklist</u></strong><br />
<ol style="text-align: left;">
<li>Check the exception details output by the DSC Configuration compilation job that is in the suspended state. Either within the Azure Portal like the following screen shot or via PowerShell by the Get-AzureRmAutomationDscCompilationJobOutput cmdlet.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQv8DsA_oO6ISXIn629F7nq1o5lshDgTid760010FcTcP10rLW3m6ixDEk41LjdoYDRQ9LS2OVKJsz-nu8Dj2qZKbHwCie2bTrokegcgCJyd9aepaefrNMzxCAhWlXsCKAAdB1wBoXbcCt/s1600/AzureAutomationDSCCompiliationException.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="965" data-original-width="1400" height="220" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQv8DsA_oO6ISXIn629F7nq1o5lshDgTid760010FcTcP10rLW3m6ixDEk41LjdoYDRQ9LS2OVKJsz-nu8Dj2qZKbHwCie2bTrokegcgCJyd9aepaefrNMzxCAhWlXsCKAAdB1wBoXbcCt/s320/AzureAutomationDSCCompiliationException.png" width="320" /></a><br /><br />Depending on the exception reported the next steps may vary. In the above screenshot it is reporting that a Composite Resource has failed to import.</li>
<li>Can you compile the configuration locally?</li>
<ol>
<li>If yes, can you upload the MOF to the DSC Node Configurations in the Azure Automation account?</li>
</ol>
<li>Are all required Modules referenced by your Configuration(s):</li>
<ol>
<li>Uploaded to your Azure Automation account</li>
<li>Up-to-date (see next point though)</li>
<li>Match the version required by your Configuration block or Composite Resource</li>
</ol>
<li>If it is a Composite resource that is failing, are all Composite resources within your module affected or is it just a subset?</li>
<li>If it is a Composite Resource, extract the configuration from the failing Composite Resource and place it directly in a Configuration block. Compile that configuration block in Azure Automation and review the output as this will provide more granular details about the specific resources used by that configuration block.</li>
<li>Try simplifying the DSC Configuration block to reduce the number of DSC Composite resources or other resources being compiled to help narrow down the culprit</li>
</ol>
<div>
<br /></div>
<div>
You should also read the "Common errors when working with Desired State Configuration (DSC)" section in the official documentation <a href="https://docs.microsoft.com/en-us/azure/automation/automation-troubleshooting-automation-errors" target="_blank">https://docs.microsoft.com/en-us/azure/automation/automation-troubleshooting-automation-errors</a></div>
<div>
<br /></div>
<br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-51272559504682067202017-08-14T00:59:00.000+10:002017-08-14T01:06:48.858+10:00Taking the Availability Group Database Level Health Detection option for a spin<div dir="ltr" style="text-align: left;" trbidi="on">
With SQL Server 2016, Availability Groups now include the option to enable Database Level Health Detection. While this option is turned off by default, Microsoft strongly recommends you enable it on all Availability Groups; it is only off by default for backwards compatibility, so you opt in by choice.<br />
<br />
In this post I will take a deep look into just what sort of database issues will cause the AG to fail over with this option enabled.<br />
<br />
For this walk through I am using the SQL Server AlwaysOn deployment from the Azure Marketplace, and have provided a link to all the scripts below so you can try out this functionality and show off your skills in your own demonstrations. If you don't have an Azure subscription, all you need is an environment with two replicas and multiple disks mounted in the virtual machines (so you can separate the transaction logs of different databases).<br />
<br />
To follow along with the scripts and screen shots here are the details of resources within my test lab.<br />
<br />
Primary Replica: SQLSERVER-0<br />
Secondary Replica: SQLSERVER-1<br />
File Share Witness: CLUSTER-FSW<br />
<br />
Existing Availability Group: Contoso-ag<br />
AG Database: AutoHa-sample<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiD4wdDouQsYVRCzu5sAgueFH-U3SB9qIEFaK1ENG-A4yMH4Hdz9nVe-oQ-piuYwBWBxNXCf4T7GYlMTMkzNpctK21GwSTowcGOpIckMw3iWs4wuycbT2H2WpXQCJoIp-93JFmlg-1Rs6l8/s1600/001-ObjectExplorer.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1344" data-original-width="785" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiD4wdDouQsYVRCzu5sAgueFH-U3SB9qIEFaK1ENG-A4yMH4Hdz9nVe-oQ-piuYwBWBxNXCf4T7GYlMTMkzNpctK21GwSTowcGOpIckMw3iWs4wuycbT2H2WpXQCJoIp-93JFmlg-1Rs6l8/s200/001-ObjectExplorer.png" width="116" /></a></div>
<br />
<br />
To make this demo easier, I have created a file share on the File Share Witness to store backups. It is best practice that when using Availability Groups you use a central location for storing backups from all replicas. Obviously a File Share Witness is not that place but for this demo lab it is fine.<br />
<br />
Central Backup Share: \\cluster-fsw\sqlbackups<br />
<br />
<br />
All the scripts used in this post are located <a href="https://github.com/Matticusau/SQLDemos/tree/master/SQL2016AGDBHealthLevelDetection" target="_blank">here</a>. This link is also contained within my final thoughts at the end of this post.<br />
<br />
Ok, lets get started.<br />
<br />
<br />
<b>1. Create the demo databases</b><br />
To enhance this demo, we will setup new databases and an availability group to show that how this setting only impacts one AG and not another.<br />
<br />
Run the following TSQL to create the demo databases. NOTE: I have explicitly put the transaction log for the SuspectDb on a different volume to the CorruptDb.<br />
<br />
[01.CreateDbs.sql]<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
:Connect SQLSERVER-0
USE master
GO
-- CorruptDb
CREATE DATABASE [CorruptDb] CONTAINMENT = NONE
ON PRIMARY (NAME = N'CorruptDb_data', FILENAME = N'F:\DATA\CorruptDb_data.mdf' , SIZE = 8192KB , FILEGROWTH = 65536KB )
LOG ON ( NAME = N'CorruptDb_log', FILENAME = N'F:\LOG\CorruptDb_log.ldf' , SIZE = 8192KB , FILEGROWTH = 65536KB )
GO
ALTER DATABASE [CorruptDb] SET COMPATIBILITY_LEVEL = 130
GO
ALTER DATABASE [CorruptDb] SET PAGE_VERIFY CHECKSUM
GO
USE [CorruptDb]
GO
CREATE TABLE [dbo].[DemoData] (id INT IDENTITY PRIMARY KEY, demoData VARCHAR(200));
GO
INSERT INTO [dbo].[DemoData] (demoData) VALUES ('Test data prior to simulating page level corruption');
GO 200
-- SuspectDb (NOTE: G:\ for TransLog)
CREATE DATABASE [SuspectDb] CONTAINMENT = NONE
ON PRIMARY (NAME = N'SuspectDb_data', FILENAME = N'F:\DATA\SuspectDb_data.mdf' , SIZE = 8192KB , FILEGROWTH = 65536KB )
LOG ON ( NAME = N'SuspectDb_log', FILENAME = N'G:\LOG\SuspectDb_log.ldf' , SIZE = 8192KB , FILEGROWTH = 65536KB )
GO
ALTER DATABASE [SuspectDb] SET COMPATIBILITY_LEVEL = 130
GO
ALTER DATABASE [SuspectDb] SET PAGE_VERIFY CHECKSUM
GO
USE [SuspectDb]
GO
CREATE TABLE [dbo].[DemoData] (id INT IDENTITY PRIMARY KEY, demoData VARCHAR(200));
GO
INSERT INTO [dbo].[DemoData] (demoData) VALUES ('Test data prior to making the database suspect');
GO 200
</pre>
<br />
<b>2. Create the demo Availability Group</b><br />
We won't actually add our corrupt database into this Availability Group yet, that will come in the next few steps.<br />
<br />
[02.CreateAG.sql]<br />
<pre class="brush:sql">
--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
-- Values you may need to change for your lab
-- Primary replica: SQLSERVER-0
-- Secondary replica: SQLSERVER-1
-- Backup Central Share: \\CLUSTER-FSW\SQLBACKUPS
---------------------------------------------------------
-- Prereqs for Availability Group
---------------------------------------------------------
:Connect SQLSERVER-0
-- Backup demo databases to share \\CLUSTER-FSW\SQLBACKUPS
BACKUP DATABASE [CorruptDb] TO DISK = N'\\CLUSTER-FSW\SQLBACKUPS\CorruptDb.bak' WITH FORMAT;
BACKUP DATABASE [SuspectDb] TO DISK = N'\\CLUSTER-FSW\SQLBACKUPS\SuspectDb.bak' WITH FORMAT;
GO
---------------------------------------------------------
-- AG Endpoints
---------------------------------------------------------
-- If you are using your own lab then you need to create a Database Mirroring end-point
-- The Azure Market Place lab already has the end-point created for you, check these with
:Connect SQLSERVER-0
SELECT * FROM sys.database_mirroring_endpoints;
GO
:Connect SQLSERVER-1
SELECT * FROM sys.database_mirroring_endpoints;
GO
-- If you need to create an endpoint use the following syntax
--CREATE ENDPOINT hadr_endpoint
-- STATE=STARTED
-- AS TCP (LISTENER_PORT=5022)
-- FOR DATABASE_MIRRORING (ROLE=ALL);
--GO
---------------------------------------------------------
-- Start the AlwaysOn Extended Events Session
---------------------------------------------------------
-- On the primary replica
:Connect SQLSERVER-0
IF EXISTS(SELECT * FROM sys.server_event_sessions WHERE name='AlwaysOn_health')
BEGIN
ALTER EVENT SESSION [AlwaysOn_health] ON SERVER WITH (STARTUP_STATE=ON);
END
IF NOT EXISTS(SELECT * FROM sys.dm_xe_sessions WHERE name='AlwaysOn_health')
BEGIN
ALTER EVENT SESSION [AlwaysOn_health] ON SERVER STATE=START;
END
GO
-- On the secondary replica
:Connect SQLSERVER-1
IF EXISTS(SELECT * FROM sys.server_event_sessions WHERE name='AlwaysOn_health')
BEGIN
ALTER EVENT SESSION [AlwaysOn_health] ON SERVER WITH (STARTUP_STATE=ON);
END
IF NOT EXISTS(SELECT * FROM sys.dm_xe_sessions WHERE name='AlwaysOn_health')
BEGIN
ALTER EVENT SESSION [AlwaysOn_health] ON SERVER STATE=START;
END
GO
---------------------------------------------------------
-- Create the AG on the primary
---------------------------------------------------------
:Connect SQLSERVER-0
USE [master];
CREATE AVAILABILITY GROUP DbHealthOptDemoAg
WITH (DB_FAILOVER = ON)
FOR DATABASE SuspectDb
REPLICA ON
'SQLSERVER-0' WITH
(
ENDPOINT_URL = 'TCP://SQLSERVER-0.Contoso.com:5022',
AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
FAILOVER_MODE = AUTOMATIC,
SEEDING_MODE = AUTOMATIC
),
'SQLSERVER-1' WITH
(
ENDPOINT_URL = 'TCP://SQLSERVER-1.Contoso.com:5022',
AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
FAILOVER_MODE = AUTOMATIC,
SEEDING_MODE = AUTOMATIC
);
GO
---------------------------------------------------------
-- Join the secondary replica to the AG (allow for seeding)
---------------------------------------------------------
:Connect SQLSERVER-1
ALTER AVAILABILITY GROUP DbHealthOptDemoAg JOIN;
GO
ALTER AVAILABILITY GROUP DbHealthOptDemoAg GRANT CREATE ANY DATABASE;
GO
</pre>
<br />
After that step you should have an Availability Group that looks like this:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5hYGh5SKCFhAWV6qeq06jdRvcToyZADvDLGZxGnkqRm34Fyjt5Qh3TMyRhjaO1ANPsJvqAAlTin6kTGqI-9-42_7KWoO3UjPB0jp_emq5c74lApysBxauCjlzt1vPTRUefV8PQXqBX8Ux/s1600/002-DBHealthOptDemoAg-PostSetup.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="370" data-original-width="630" height="116" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5hYGh5SKCFhAWV6qeq06jdRvcToyZADvDLGZxGnkqRm34Fyjt5Qh3TMyRhjaO1ANPsJvqAAlTin6kTGqI-9-42_7KWoO3UjPB0jp_emq5c74lApysBxauCjlzt1vPTRUefV8PQXqBX8Ux/s200/002-DBHealthOptDemoAg-PostSetup.png" width="200" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
On your primary replica you should have the following databases</div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMOykmuHCpQ9h_iO0iubtaiQ-95Z51GIOAhNNiFuxKIJ48duQ9f3HsvHD_DHILQjRlRidcf2hXXi0osgZOqB435qt7KmFYRRs3_96ImSXmqfM003mOQFRX2XMDRE-VsTChQaq1C8exTH41/s1600/003-PostSetup-DBsOnPrimary.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="257" data-original-width="709" height="71" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMOykmuHCpQ9h_iO0iubtaiQ-95Z51GIOAhNNiFuxKIJ48duQ9f3HsvHD_DHILQjRlRidcf2hXXi0osgZOqB435qt7KmFYRRs3_96ImSXmqfM003mOQFRX2XMDRE-VsTChQaq1C8exTH41/s200/003-PostSetup-DBsOnPrimary.png" width="200" /></a></div>
<br />
On the secondary replica you should only have the SuspectDb for now. We still need to do some work to setup the CorruptDb.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0u82x5uHSDxjBwA9S5Ziw2s0CsREAh8X2j2jx84qyWgI0qd18KfI06GnplakSHjBApEFism41CkCh6Xgr1gBLgzcvmprOMNl-LdgelQNXq4AMMzzdz8jPoiv1hQ-4PoqioGsl_6Psu7is/s1600/004-PostSetup-DBsOnSecondary.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="227" data-original-width="709" height="63" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0u82x5uHSDxjBwA9S5Ziw2s0CsREAh8X2j2jx84qyWgI0qd18KfI06GnplakSHjBApEFism41CkCh6Xgr1gBLgzcvmprOMNl-LdgelQNXq4AMMzzdz8jPoiv1hQ-4PoqioGsl_6Psu7is/s200/004-PostSetup-DBsOnSecondary.png" width="200" /></a></div>
<br />
If you do not have the SuspectDb on the secondary replica after joining the replica to the AG, the Automatic Seeding option may not have succeeded. The script provided contains the required steps to restore and join the database on the secondary replica.<br />
<br />
<b><br /></b>
<b><br /></b>
<b>3. Verify Database Level Health Detection option is enabled</b><br />
Open the properties of the AG and make sure the option is checked.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh8igc3_sQ5lVwX8Dc7-TVwILHpHD195rgKMUpZNXa6nx5M8RWWcAPsA_TladRQoYBNvmDOWdRUeL_ifPFXYB3D4WE2oLCfscbOJB4r7fp8Sc9t89Q5Fbuj4qyXugizLEKrUDtFYc_PrzCd/s1600/005-AGDBHealthLevelDetectionEnabled.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="566" data-original-width="1410" height="128" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh8igc3_sQ5lVwX8Dc7-TVwILHpHD195rgKMUpZNXa6nx5M8RWWcAPsA_TladRQoYBNvmDOWdRUeL_ifPFXYB3D4WE2oLCfscbOJB4r7fp8Sc9t89Q5Fbuj4qyXugizLEKrUDtFYc_PrzCd/s320/005-AGDBHealthLevelDetectionEnabled.png" width="320" /></a></div>
<br />
<br />
Alternatively, check that the value of db_failover is 1 in sys.availability_groups:
[04.VerifyAg.EnableDBHealthLevelDetection.sql]
<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
:Connect SQLSERVER-0
USE [master]
GO
SELECT name, db_failover FROM sys.availability_groups
GO
</pre>
<br />
If you need to enable the setting either use the GUI or run 03.AlterAG.EnableDBHealthLevelDetection.sql<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
:Connect SQLSERVER-0
USE [master]
GO
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg] SET(DB_FAILOVER = ON);
GO
</pre>
<br />
<div style="text-align: left;">
<b><br /></b><b>4. Corrupt the demo database</b></div>
The first thing we are going to look at is what happens when you have page level corruption, and whether corruption really is corruption. Full credit to Paul Randal for an existing blog on how to do this. See https://www.sqlskills.com/blogs/paul/dbcc-writepage/<br />
<br />
Because we are going to use DBCC WRITEPAGE and need the database to be in single user mode, we have not yet been able to add it to the Availability Group. This won't impact the demo though, as the integrity of data pages is not validated when a database is brought online or added to an availability group.<br />
<br />
Run the following statements to corrupt the database. WARNING!!! Use this at your own risk and never ever ever ever do this on a production environment.<br />
<br />
[05.CorruptData.sql]<br />
<pre class="brush:sql">-- get the page information of our demo database and table
:Connect SQLSERVER-0
DBCC IND (N'CorruptDb', N'DemoData', -1);
GO
-- Corrupt the page data (remember to change the page number)
:Connect SQLSERVER-0
ALTER DATABASE [CorruptDb] SET SINGLE_USER;
GO
DBCC WRITEPAGE (N'CorruptDb', 1, [PageId], 4000, 1, 0x45, 1);
GO
ALTER DATABASE [CorruptDb] SET MULTI_USER;
GO
-- check that we have caused page level corruption, cause a data check
:Connect SQLSERVER-0
DBCC DROPCLEANBUFFERS
GO
USE [CorruptDb]
GO
SELECT * FROM [dbo].[DemoData];
GO
</pre>
<br />
If this goes to plan then you should receive this error:<br />
<br />
<i><span style="color: red;">Msg 824, Level 24, State 2, Line 27</span></i><br />
<i><span style="color: red;">SQL Server detected a logical consistency-based I/O error: incorrect checksum (expected: 0x3ea8609e; actual: 0x3ea8259e). It occurred during a read of page (1:320) in database ID 6 at offset 0x00000000280000 in file 'F:\DATA\CorruptDb_data.mdf'. Additional messages in the SQL Server error log or system event log may provide more detail. This is a severe error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.</span></i><br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKApTIBMbLUKGNbK5zMHt4yyXL8G6Gaesfo2BIhFSX8FmvR7w-4V7QctHcSpqP7KSU1sQP3fNXDnipq_bD4QrZ8A3FrMpDPcLoOxIrcIMpDRUxclnSjxNWw85pTp_ipoGOZoAX6awJ1kjs/s1600/006-LogicalConsistencyIOError.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="206" data-original-width="1600" height="41" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKApTIBMbLUKGNbK5zMHt4yyXL8G6Gaesfo2BIhFSX8FmvR7w-4V7QctHcSpqP7KSU1sQP3fNXDnipq_bD4QrZ8A3FrMpDPcLoOxIrcIMpDRUxclnSjxNWw85pTp_ipoGOZoAX6awJ1kjs/s320/006-LogicalConsistencyIOError.png" width="320" /></a></div>
<br />
<br />
<br />
<b>5. Add the database into the Availability Groups</b><br />
Now that we have a corrupted database, lets add it into the availability group using the following statements<br />
<br />
[06.AddCorruptDbToAG.sql]<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
-- Add the database to the AG on the Primary
:Connect SQLSERVER-0
USE [master]
GO
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg]
MODIFY REPLICA ON N'SQLSERVER-1' WITH (SEEDING_MODE = AUTOMATIC)
GO
USE [master]
GO
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg]
ADD DATABASE [CorruptDb];
GO
-- Make sure the secondary is set to Auto Seed with CREATE DB permissions
:Connect SQLSERVER-1
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg] GRANT CREATE ANY DATABASE;
GO
-- if auto seeding doesn't automatically work, check the logs as if you haven't cleaned up
-- the data and log files from previous demos they may prevent the auto seeding.
-- The following statements can be used if AUTO Seeding doesn't run (once the issue is resolved)
:Connect SQLSERVER-0
BACKUP DATABASE [CorruptDb] TO DISK = N'\\CLUSTER-FSW\SQLBACKUPS\CorruptDb_addtoAG.bak' WITH FORMAT;
GO
:Connect SQLSERVER-1
RESTORE DATABASE [CorruptDb] FROM DISK = N'\\CLUSTER-FSW\SQLBACKUPS\CorruptDb_addtoAG.bak' WITH NORECOVERY;
GO
:Connect SQLSERVER-0
BACKUP LOG [CorruptDb] TO DISK = N'\\CLUSTER-FSW\SQLBACKUPS\CorruptDb_addtoAG.trn' WITH FORMAT;
GO
:Connect SQLSERVER-1
RESTORE LOG [CorruptDb] FROM DISK = N'\\CLUSTER-FSW\SQLBACKUPS\CorruptDb_addtoAG.trn' WITH NORECOVERY;
GO
ALTER DATABASE [CorruptDb] SET HADR AVAILABILITY GROUP = [DbHealthOptDemoAg];
GO
</pre>
<br />
<b>6. Reviewing the behavior of a corrupted database in Availability Groups</b><br />
Now you should have both databases in the Availability Group. Open the Availability Group Dashboard and take note that everything is in a healthy state.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkRhBzbjzQj9wzl1VaSqQiHunVlFqJLhVG-IoqYCLtCeVJRO1VZX5zeMIw-eekcZAQZYsD4c8H-GejOJMj9YnaQ8uVSo64whGQKcGvcAzFYPKGlcnr3ndShfuu9qqwjfG4MaC247bZt6VM/s1600/007-AGDashboardHealthy.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="869" data-original-width="1600" height="173" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkRhBzbjzQj9wzl1VaSqQiHunVlFqJLhVG-IoqYCLtCeVJRO1VZX5zeMIw-eekcZAQZYsD4c8H-GejOJMj9YnaQ8uVSo64whGQKcGvcAzFYPKGlcnr3ndShfuu9qqwjfG4MaC247bZt6VM/s320/007-AGDashboardHealthy.png" width="320" /></a></div>
<br />
... but wait. Didn't we turn on database health level detection for the AG and corrupt the database? Run a statement to force SQL Server to generate the logical consistency I/O error like before.<br />
<br />
[07.GenerateLogicalConsistencyError.sql]<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
-- Generate the Logical Consistency check error
:Connect SQLSERVER-0
DBCC DROPCLEANBUFFERS
GO
USE [CorruptDb]
GO
SELECT * FROM [dbo].[DemoData];
GO
</pre>
<br />
But did this cause a failover? Check the state of the Availability Group either within Object Explorer or within the dashboard.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkRhBzbjzQj9wzl1VaSqQiHunVlFqJLhVG-IoqYCLtCeVJRO1VZX5zeMIw-eekcZAQZYsD4c8H-GejOJMj9YnaQ8uVSo64whGQKcGvcAzFYPKGlcnr3ndShfuu9qqwjfG4MaC247bZt6VM/s1600/007-AGDashboardHealthy.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="869" data-original-width="1600" height="173" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkRhBzbjzQj9wzl1VaSqQiHunVlFqJLhVG-IoqYCLtCeVJRO1VZX5zeMIw-eekcZAQZYsD4c8H-GejOJMj9YnaQ8uVSo64whGQKcGvcAzFYPKGlcnr3ndShfuu9qqwjfG4MaC247bZt6VM/s320/007-AGDashboardHealthy.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<br />
Notice that this didn't cause a failover, because the error is not significant enough to affect the database state. If this database were not in an Availability Group it would still remain online (as it did when we corrupted it). The Database Health Level Detection only triggers when an event is significant enough to affect the database state (e.g. Offline, Suspect, etc.). To quote the official <a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/sql-server-always-on-database-health-detection-failover-option" target="_blank">documentation</a>: "<i>The database level health detection notices when a database is no longer in the online status</i>".<br />
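<br />
If you want to check or change this behaviour yourself, database level health detection is controlled by the <b>DB_FAILOVER</b> option on the Availability Group. A quick way to review and enable it (using this demo's AG name; run the ALTER on the primary) is:<br />
<br />
<pre class="brush:sql">-- Check whether database level health detection is enabled per AG
SELECT name, db_failover FROM sys.availability_groups;
GO
-- Enable database level health detection on the demo AG
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg] SET (DB_FAILOVER = ON);
GO
</pre>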
<br />
So the lesson at this point is that database corruption, at least page-level corruption, is not enough to trigger a failover. Even if you run a DBCC CHECKDB it won't trigger the failover (hint: give that a go for yourself). So the recommendation here is that you should be running regular DBCC CHECKDB on your secondaries and failing over frequently (thus running consistency checks on all replicas regularly).<br />
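<br />
As a sketch of that routine check, assuming SQLSERVER-1 is configured as a readable secondary (DBCC CHECKDB needs read access on the replica you target):<br />
<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
-- Run a consistency check against the secondary replica
:Connect SQLSERVER-1
DBCC CHECKDB ([CorruptDb]) WITH NO_INFOMSGS, ALL_ERRORMSGS;
GO
</pre>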
<br />
<b>7. Simulate Suspect database state</b><br />
So now let's look at what happens when something significant enough occurs to affect the database status. The easiest way to simulate this would be to use the ALTER DATABASE statement; however, once a database is added to an Availability Group there are many statements that the database engine will not allow you to execute, to protect the state of the database.<br />
<br />
So the way we can replicate this is to simulate an I/O failure.<br />
<br />
Remember that when we created SuspectDb on SQLSERVER-0 we set the transaction log's path to a different disk/partition from the other databases.<br />
<br />
Open Disk Management on the primary node SQLSERVER-0<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj45xDTNxrfnaybPdhjkPypuwtyVeTNKFdIxIYinet4Bc28JxLtmYzzVHMYqkVatfgBKtNIW8Vcp-c1gM-lEo9yOGssv5HHUveapGuC3kD0_ms-93Qa2j4NLOOpjCY94mihkwSbdLnTi3d7/s1600/008-DiskManagement.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="901" data-original-width="1285" height="224" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj45xDTNxrfnaybPdhjkPypuwtyVeTNKFdIxIYinet4Bc28JxLtmYzzVHMYqkVatfgBKtNIW8Vcp-c1gM-lEo9yOGssv5HHUveapGuC3kD0_ms-93Qa2j4NLOOpjCY94mihkwSbdLnTi3d7/s320/008-DiskManagement.png" width="320" /></a></div>
<br />
Locate the disk hosting the G:\ volume, or whichever volume holds the SuspectDb transaction log file. Right-click the disk and select <b>Offline</b>.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOiJT4zs3tT7VywpD3kvpGySYMG1LXhI7y4XvqDT47zZWhkSfniDRwGTTVpItidy2-hfu35ISZAJaEJIDr95eKeO7XKpmqkMOi89Wp2V_lYOBrPWneVeySENuctKBB-FjlRrQamQfM3k0x/s1600/009-DiskOffline.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="536" data-original-width="598" height="178" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOiJT4zs3tT7VywpD3kvpGySYMG1LXhI7y4XvqDT47zZWhkSfniDRwGTTVpItidy2-hfu35ISZAJaEJIDr95eKeO7XKpmqkMOi89Wp2V_lYOBrPWneVeySENuctKBB-FjlRrQamQfM3k0x/s200/009-DiskOffline.png" width="200" /></a></div>
<br />
In SQL Server Management Studio, refresh the Availability Groups branch of Object Explorer for SQLSERVER-0. Notice that it is still the primary replica.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjug4eRdj3SWUFIM4CTIUZ5fkqyrEF1DEJb0yfae5LcQuUj3AemOI1rewPOeJRtHNPb37H3KJ2I93SkxVrggCObWV24HBgC-oG74CaKNPvd9PiKEz4yeduYvfHDgMTVPsutdzGHiK21TXE0/s1600/010-AGHasntFailedOverYet.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="647" data-original-width="716" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjug4eRdj3SWUFIM4CTIUZ5fkqyrEF1DEJb0yfae5LcQuUj3AemOI1rewPOeJRtHNPb37H3KJ2I93SkxVrggCObWV24HBgC-oG74CaKNPvd9PiKEz4yeduYvfHDgMTVPsutdzGHiK21TXE0/s200/010-AGHasntFailedOverYet.png" width="200" /></a></div>
<br />
<br />
At this stage we haven't caused an I/O operation (in our lab there are no maintenance tasks to back up the transaction logs). So let's insert some data to cause the transaction log to be accessed.<br />
<br />
[08.WriteToSuspectDbTransLog.sql]<br />
<pre class="brush:sql">--- YOU MUST EXECUTE THE FOLLOWING SCRIPT IN SQLCMD MODE.
:Connect SQLSERVER-0
USE [SuspectDb]
GO
INSERT INTO [dbo].[DemoData] (demoData) VALUES ('Data to push through the TransLog');
GO
</pre>
<br />
Bingo, we generated an I/O error significant enough to affect the database status.<br />
<br />
<i><span style="color: red;">Msg 945, Level 14, State 2, Line 3</span></i><br />
<i><span style="color: red;">Database 'SuspectDb' cannot be opened due to inaccessible files or insufficient memory or disk space. See the SQL Server errorlog for details.</span></i><br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjjMz8kbiuFsWjNhiFz9HPwuIuh-_GlJ_3qEI0b3EGXK36C7ghLYUkfQ6bVGnLO2GdJ6uZjlHvktCCLFm6pNIA9tkatGxcaXCRw2hPw6d5jZ1GJnLCEeLFjuFkOoEjF15ENjB4YBNTF98BI/s1600/011-TransLogIOError.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="180" data-original-width="1600" height="36" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjjMz8kbiuFsWjNhiFz9HPwuIuh-_GlJ_3qEI0b3EGXK36C7ghLYUkfQ6bVGnLO2GdJ6uZjlHvktCCLFm6pNIA9tkatGxcaXCRw2hPw6d5jZ1GJnLCEeLFjuFkOoEjF15ENjB4YBNTF98BI/s320/011-TransLogIOError.png" width="320" /></a></div>
<div>
<br /></div>
<div>
But what happened to the Availability Group? Refresh the Object Explorer for SQLSERVER-0, and notice the status of the database and the Availability Group role.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhr5IKlJw6b-Ok3SM5uI-FswAk0hSm82yBBnGa1aytwv1c4Ud3h_3N8NTJhZhc80-mIw1yZDHjNcNB0d6B4rmsA1G182uGuNUyu6vrKln-D3YwyEf8ts_T2ix807z9_b18B3wR_7l40ZJak/s1600/012-SuspectDbAndAGFailedOver.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="657" data-original-width="741" height="176" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhr5IKlJw6b-Ok3SM5uI-FswAk0hSm82yBBnGa1aytwv1c4Ud3h_3N8NTJhZhc80-mIw1yZDHjNcNB0d6B4rmsA1G182uGuNUyu6vrKln-D3YwyEf8ts_T2ix807z9_b18B3wR_7l40ZJak/s200/012-SuspectDbAndAGFailedOver.png" width="200" /></a></div>
<div>
<br /></div>
<div>
So it looks like we failed over, but only for the DbHealthOptDemoAg, because that was the only AG where a database status changed. Let's check the state of our Availability Group post-failover. Switch to SQLSERVER-1 in Object Explorer and expand the Availability Groups. Open the Dashboard for the Availability Group.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhBHTmLflDOKpiOMqu6XizqD6MJEOpgV_0BOpYDoCu4wruyMuPSAwpFV6iAJOCgt8_kh8VwCNdUlVeEblzoz76cC4vwsu1aT4WQNfbZInZbCZ-7JNqIvVc73S7PsTrDqjUSiriprwtjoYWJ/s1600/013-AGDashboardSuspectDbFailover.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="647" data-original-width="1600" height="129" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhBHTmLflDOKpiOMqu6XizqD6MJEOpgV_0BOpYDoCu4wruyMuPSAwpFV6iAJOCgt8_kh8VwCNdUlVeEblzoz76cC4vwsu1aT4WQNfbZInZbCZ-7JNqIvVc73S7PsTrDqjUSiriprwtjoYWJ/s320/013-AGDashboardSuspectDbFailover.png" width="320" /></a></div>
<div>
<br /></div>
<div>
This confirms we have failed over, with SQLSERVER-1 now the primary replica, because of the I/O error experienced on SQLSERVER-0 from the storage sub-system failure (loss of disk). The DB Health Level Detection setting worked. Without that setting enabled this would not have caused a failover.</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b><u>My final thoughts</u></b></div>
The Database Level Health Check setting is a great new capability for increasing the availability of the databases in an Availability Group when one of them experiences an issue significant enough to affect its status, like a loss of disk. It still will not protect you from certain data issues like corruption, so monitoring and maintenance are still critical!<br />
<br />
Finally, you should also consider the <a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/configure-flexible-automatic-failover-policy" target="_blank">Flexible Automatic Failover Policy</a>. While I do not recommend changing this blindly, you should ensure you understand what role that setting plays in Automatic Failover.<br />
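<br />
For context, that policy is exposed through two Availability Group options, <b>FAILURE_CONDITION_LEVEL</b> and <b>HEALTH_CHECK_TIMEOUT</b>. The values below are for illustration only; review the documentation before changing them in your environment:<br />
<br />
<pre class="brush:sql">-- Review the current flexible failover policy settings
SELECT name, failure_condition_level, health_check_timeout
FROM sys.availability_groups;
GO
-- Illustrative values only: failure condition level 3 (default) and a 60 second health check timeout
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg] SET (FAILURE_CONDITION_LEVEL = 3);
GO
ALTER AVAILABILITY GROUP [DbHealthOptDemoAg] SET (HEALTH_CHECK_TIMEOUT = 60000);
GO
</pre>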
<br />
All the scripts used in this post are located in my GitHub <a href="https://github.com/Matticusau/SQLDemos/tree/master/SQL2016AGDBHealthLevelDetection">https://github.com/Matticusau/SQLDemos/tree/master/SQL2016AGDBHealthLevelDetection</a> <br />
<br />
<br />
<br />
<b><u>References relevant to this post</u></b><br />
<br />
<a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/sql-server-always-on-database-health-detection-failover-option" target="_blank">Availability group database level health detection failover option</a><br />
<br />
Credit to Paul Randal for how to use DBCC WRITEPAGE - <a href="https://www.sqlskills.com/blogs/paul/dbcc-writepage/">https://www.sqlskills.com/blogs/paul/dbcc-writepage/</a><br />
<br />
<br />
<a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/configure-flexible-automatic-failover-policy" target="_blank">Configure Flexible Automatic Failover Policy</a><br />
<br />
<a href="https://docs.microsoft.com/en-us/sql/database-engine/availability-groups/windows/flexible-automatic-failover-policy-availability-group" target="_blank">Flexible Automatic Failover Policy - Availability Group</a><br />
<br />
<br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em><br />
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-62383133302055572042017-04-27T12:09:00.000+10:002017-04-27T12:10:38.983+10:00Installing SqlServer PowerShell module from PowerShellGallery.com<div dir="ltr" style="text-align: left;" trbidi="on">
With the release of SQL Server Management Studio 17.0 (SSMS) the SqlServer PowerShell module has finally been decoupled and can now be installed independently. To take full advantage of this change I recommend using PowerShell 5.0 (if you're on Windows 10 or Server 2016 then this is your default).<br />
<br />
Full details of the SSMS 17.0 updates <a href="https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms" target="_blank">https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms</a><br />
<br />
The official installation details for the SqlServer module are at<br />
<a href="https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-ps-module" target="_blank">https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-ps-module</a> but they are currently a bit light on detail, so this post will walk through the process in more depth. You can also find the public listing of the SqlServer module on <a href="https://www.powershellgallery.com/packages/SqlServer" target="_blank">PowerShellGallery</a>.<br />
<br />
Firstly check what modules you have installed.<br />
<br />
<pre class="brush:ps">Get-Module -Name SQL* -ListAvailable
</pre>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhu1SRA2DSprn6GNqO65A04c0r21inu5JPPkosxtLfezPSaKlfvMKS90NSY_qUvANHdfaerJmArAA9pqzhHw7N2nsvogCSItVVW39eqnm8SodtZJEQamW9puOpCK5ozfU9MwYyZn8vuL-a_/s1600/SqlServerPSModule.001.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="140" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhu1SRA2DSprn6GNqO65A04c0r21inu5JPPkosxtLfezPSaKlfvMKS90NSY_qUvANHdfaerJmArAA9pqzhHw7N2nsvogCSItVVW39eqnm8SodtZJEQamW9puOpCK5ozfU9MwYyZn8vuL-a_/s400/SqlServerPSModule.001.png" width="400" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
If you have installed the SQL Server DBEngine then you should have SQLPS for that particular version. On my server I upgraded SSMS 2016 to SSMS 17.0 so I have both 130 and 140 versions installed.<br />
<br />
What I don't currently have installed is the SqlServer PowerShell module, and my SQLPS module hasn't been updated with the upgrade of SSMS to version 17.0.<br />
<br />
So let's download and install the latest SqlServer module.<br />
<br />
First, check that your PowerShellGet is configured for the public gallery (the default setup).<br />
<br />
<pre class="brush:ps">Get-PSRepository
</pre>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-b8ltRLXQ9u0V0kvARuRBbpB6_HFpHhegFIcPTRZqM2ivv5gBHlDZCL2_i1LfQ92nvcXjMah_w-WvLiNAdovM2XdSPy6CiXBj8pW5ja7i-eksnJ9wOy1gStIyunAZwisYD2e2m_NeL1go/s1600/SqlServerPSModule.002.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="70" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-b8ltRLXQ9u0V0kvARuRBbpB6_HFpHhegFIcPTRZqM2ivv5gBHlDZCL2_i1LfQ92nvcXjMah_w-WvLiNAdovM2XdSPy6CiXBj8pW5ja7i-eksnJ9wOy1gStIyunAZwisYD2e2m_NeL1go/s400/SqlServerPSModule.002.png" width="400" /></a></div>
<br />
<br />
You should have the PSGallery repository set up. By default it will be untrusted, which just means you get prompted before installing any modules from it. It is recommended that public repositories not be marked as trusted, but internal private repositories can be, to make installations easier.<br />
<br />
If you don't have any other repositories configured then you won't have to supply the repository name to any future commands; but if you do, you can simplify module searches by specifying the repository name. This is very useful if you want to host private custom copies of publicly available modules on a private repository.<br />
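<br />
As a sketch of that private repository setup (the repository name and feed URL below are placeholders for your own environment):<br />
<br />
<pre class="brush:ps"># Register an internal feed as a trusted private repository (placeholder name/URL)
Register-PSRepository -Name MyInternalRepo -SourceLocation 'https://nuget.contoso.local/api/v2' -InstallationPolicy Trusted

# Then target it explicitly when searching or installing
Find-Module -Name SqlServer -Repository MyInternalRepo
</pre>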
<br />
Now you can search for the module in the gallery to check the most recent version available.<br />
<br />
<pre class="brush:ps">Find-Module -Name SqlServer
</pre>
<br />
NOTE: If prompted to continue enter 'y'<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCeBCmEmfW0MioZwm_-fg_UJmB2zOJp60WOXFTuzPqcvH-bjo0YOEZrfQWDKCPh8ylG97xtCrWdt1BKpYcowHJDWePYEFddeMlwaaOEcirJa4NY9P61JK9ZYHOa_cRL43xuwk-3w_zgICs/s1600/SqlServerPSModule.003.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="80" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCeBCmEmfW0MioZwm_-fg_UJmB2zOJp60WOXFTuzPqcvH-bjo0YOEZrfQWDKCPh8ylG97xtCrWdt1BKpYcowHJDWePYEFddeMlwaaOEcirJa4NY9P61JK9ZYHOa_cRL43xuwk-3w_zgICs/s400/SqlServerPSModule.003.png" width="400" /></a></div>
<br />
<br />
Add the <b>-AllVersions</b> parameter to list all the available versions. Currently there is only one version in the gallery but this will change over time.<br />
<br />
To install the module from the gallery run <b>Install-Module</b>. Use the <b>Scope</b> parameter to install for either <b>CurrentUser</b> or <b>AllUsers</b> based on your needs. With PowerShell 5.0 it is also possible to install versions side-by-side. In my case I am installing the module for all users and requiring a specific version (the current one).<br />
<br />
To avoid an error message like "<i><b>A command with name '&lt;cmdletname&gt;' is already available on this system</b></i>" I will also use the <b>-AllowClobber</b> parameter.<br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEs8x1br7t5YOFZSZKFeV32mRD7uLUEsNPcjPJ-ov_4oGvOou6pbzyeMQalEXzn7BZFeHefXscLBMrOVmzxy90E24hT7mlzkt0Bfi8kvM2S4swNlULjG2QJHAmtSD3lmyz0lDTjOp7M-X2/s1600/SqlServerPSModule.004.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="131" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEs8x1br7t5YOFZSZKFeV32mRD7uLUEsNPcjPJ-ov_4oGvOou6pbzyeMQalEXzn7BZFeHefXscLBMrOVmzxy90E24hT7mlzkt0Bfi8kvM2S4swNlULjG2QJHAmtSD3lmyz0lDTjOp7M-X2/s400/SqlServerPSModule.004.png" width="400" /></a></div>
<div>
</div>
<div>
<br /></div>
<div>
<br /></div>
<pre class="brush:ps">Install-Module -Name SqlServer -Scope AllUsers -RequiredVersion 21.0.17099 -AllowClobber
</pre>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhi37lusRtD7zqAuCVoo0JOmriCHI4NJLm-OQ_q1Kt4fw7ZMSKTwhaqjonHeLLPgl1zGreSP69DcI8faKMzVzluzWQXfBqn8qZ53FcavVqbVRsdA719M_wVkMkdGDdtQUSIoiVjzWeLyfyn/s1600/SqlServerPSModule.005.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="40" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhi37lusRtD7zqAuCVoo0JOmriCHI4NJLm-OQ_q1Kt4fw7ZMSKTwhaqjonHeLLPgl1zGreSP69DcI8faKMzVzluzWQXfBqn8qZ53FcavVqbVRsdA719M_wVkMkdGDdtQUSIoiVjzWeLyfyn/s400/SqlServerPSModule.005.png" width="400" /></a></div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQIom3_UgAwQ0e-yIKEc4nMxoXMiMis50rnxGXcbf_qYaWms6Kq_bmCueXKkD6U_VxdY6CGC_GVXr2DqaFXn3NPU3wnxJ5LSN6bDQezyBwRyYUT1gK2cBasCR-28M2fNlLOhTRcsRhDZcx/s1600/SqlServerPSModule.006.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="65" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQIom3_UgAwQ0e-yIKEc4nMxoXMiMis50rnxGXcbf_qYaWms6Kq_bmCueXKkD6U_VxdY6CGC_GVXr2DqaFXn3NPU3wnxJ5LSN6bDQezyBwRyYUT1gK2cBasCR-28M2fNlLOhTRcsRhDZcx/s400/SqlServerPSModule.006.png" width="400" /></a></div>
<br />
<br />
Now check that the module is available for use on the server with<br />
<br />
<pre class="brush:ps">Get-Module -Name SqlServer -ListAvailable
</pre>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmz_iTxbPMEQ2MglgqddnLYpEa2gYQSdsfAibNPKGlZCZU9c3qJnhYxjoCUB6883KOr7CqROF4oL7hTNBIMUHGZScz31WbJyDjPXz6i5BzQvqWLJgk4jC6LTVAdonCqwG_JUveamYNmEH7/s1600/SqlServerPSModule.007.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="92" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmz_iTxbPMEQ2MglgqddnLYpEa2gYQSdsfAibNPKGlZCZU9c3qJnhYxjoCUB6883KOr7CqROF4oL7hTNBIMUHGZScz31WbJyDjPXz6i5BzQvqWLJgk4jC6LTVAdonCqwG_JUveamYNmEH7/s400/SqlServerPSModule.007.png" width="400" /></a></div>
<br />
<br />
Notice that the path is now one of the common $PSModulePath paths. This is also part of the improvements that these changes bring.<br />
<br />
Now going forward to update the module you can use the <b>Find-Module</b> and <b>Update-Module</b> cmdlets. Look out for my post on automating that process next month.<br />
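<br />
As a minimal sketch of that update check:<br />
<br />
<pre class="brush:ps"># Compare the newest locally installed version against the gallery
$installed = Get-Module -Name SqlServer -ListAvailable |
    Sort-Object -Property Version -Descending | Select-Object -First 1
$latest = Find-Module -Name SqlServer
if ($latest.Version -gt $installed.Version) {
    Update-Module -Name SqlServer
}
</pre>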
<br />
<br />
Happy PowerShell'ing<br />
<br />
<br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em><br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;"><br /></em>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;"><br /></em></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-82905059150154167272017-03-28T16:45:00.000+10:002017-05-04T10:37:16.293+10:00Cmder, enhance your Windows Terminal Console experience<div dir="ltr" style="text-align: left;" trbidi="on">
One of the best new tools I have been using lately is Cmder. A Terminal Emulator for Windows which is great for a number of reasons:<br />
<br />
<ul style="text-align: left;">
<li>Tabbed terminals </li>
<li>Multiple languages can be emulated</li>
<li>Fully customisable</li>
<li>and best of all Quake mode</li>
</ul>
<div>
If you haven't seen it yet check it out <a href="http://cmder.net/" target="_blank">http://cmder.net</a></div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVf8cwq2oB0Ga-LH8l8yVR58bLyliYBS5gnPfHyBjvSseYnNIDkaZ0SkaBo53PhvFzWlPOL9KT1T53Ss18Whj6kpp2TNoRQKsCa8pfOQ3Qog9oaJ99dvU5ec_sTsu328qsx64jGMOoxuCw/s1600/cmder-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="175" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVf8cwq2oB0Ga-LH8l8yVR58bLyliYBS5gnPfHyBjvSseYnNIDkaZ0SkaBo53PhvFzWlPOL9KT1T53Ss18Whj6kpp2TNoRQKsCa8pfOQ3Qog9oaJ99dvU5ec_sTsu328qsx64jGMOoxuCw/s400/cmder-01.png" width="400" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
I do a lot of work with PowerShell and Git, and usually have many things on the go at once. What I really like about Cmder is that you can enable Quake mode, which hides the terminal window at the top of your monitor screen and allows a keyboard shortcut (Ctrl+`) to show/hide it, making it very quick to open over the top of other applications you're working in (e.g. VS Code) and run commands. For example, when I am working on development projects I usually have VS Code open to edit the source code files, and a PowerShell or Bash terminal emulated in Cmder for all my Git commands on the repo. Of course you could just use the Git features of VS Code in this case, but I almost always find there is some conflict I have to resolve at the Git console, so I just prefer to work there. There are plenty of other reasons I use Cmder too.</div>
<div>
<br /></div>
<div>
To get you started, here are my quick setup steps which will set it up with at least Quake mode and your favourite terminal as default.<br />
<br /></div>
<div>
<ol style="text-align: left;">
<li>Download and install Cmder from <a href="http://cmder.net/" target="_blank">http://cmder.net</a></li>
<li>Open Cmder.exe from the location you extracted it to.</li>
<li>Click on the 3 bar menu item in the bottom right<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjY8QMxwMM4thHeMQ-t8a5PMBxt_qtNKnRiEglX8DypnHjRbzsyezXcKQOCEmvXGqi_y1K08Sz5_RJE05gye2Ik6JyxHap5mHrKv9gFsswppSaGsOrAM77mV-WC7fG-yejlTQBiZOhzB-wt/s1600/cmder-settings-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="20" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjY8QMxwMM4thHeMQ-t8a5PMBxt_qtNKnRiEglX8DypnHjRbzsyezXcKQOCEmvXGqi_y1K08Sz5_RJE05gye2Ik6JyxHap5mHrKv9gFsswppSaGsOrAM77mV-WC7fG-yejlTQBiZOhzB-wt/s400/cmder-settings-01.png" width="400" /></a></li>
<li>Select Settings<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZ0csYqLcF9JuRsLITMCuJ5zJCHBhW-g-LxqhADzJrET4YnJIZr3sJTcCjGnJKZFNPVX5fWp2uF7mxiHZsqlQ7ed7R8GrA-936kAQuGYWQz6vzpY6M2t3dINjB4S9kYgJ0MiNRfmZhBxIk/s1600/cmder-settings-04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="124" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZ0csYqLcF9JuRsLITMCuJ5zJCHBhW-g-LxqhADzJrET4YnJIZr3sJTcCjGnJKZFNPVX5fWp2uF7mxiHZsqlQ7ed7R8GrA-936kAQuGYWQz6vzpY6M2t3dINjB4S9kYgJ0MiNRfmZhBxIk/s200/cmder-settings-04.png" width="200" /></a></li>
<li>I have a Surface Book as my laptop which comes with a very large resolution (3000x2000) so I found I needed to change the "<b style="font-style: italic;">Size & Pos</b>" settings, but obviously this is personal preference.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggYOzPLskAx7KHpiua2IHl8NzWPhJZpYW0pkiWqYooOxdY8l-aCyZlkPXsG8HFV5g1giQc97JbSs3x58mI2OwrcWq3GcYOUBSYvSiDuN0qplDX1LeB3upTmAigLthGu-jpYJzKa3X5Sox4/s1600/cmder-settings-05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="179" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggYOzPLskAx7KHpiua2IHl8NzWPhJZpYW0pkiWqYooOxdY8l-aCyZlkPXsG8HFV5g1giQc97JbSs3x58mI2OwrcWq3GcYOUBSYvSiDuN0qplDX1LeB3upTmAigLthGu-jpYJzKa3X5Sox4/s320/cmder-settings-05.png" width="320" /></a></li>
<li>Click the <b><i>Main > Quake style</i></b> option in the setting menu. Ensure "<b><i>Quake style slide down</i></b>" is enabled and any other options as you require.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXEbTvHBJW38z5uB52v-A8ivVp1W1eIVgzkQ-lCxXfqSwosmdKHVv1CUoNKY2N_7N0ofsvcINyXIjDDjtlG0Aj3p4Ocab_sECKuNUzmv6uQdefB4lWB_yScQSyV1EAnNgPTuDfitukqiGX/s1600/cmder-settings-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="122" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXEbTvHBJW38z5uB52v-A8ivVp1W1eIVgzkQ-lCxXfqSwosmdKHVv1CUoNKY2N_7N0ofsvcINyXIjDDjtlG0Aj3p4Ocab_sECKuNUzmv6uQdefB4lWB_yScQSyV1EAnNgPTuDfitukqiGX/s400/cmder-settings-02.png" width="400" /></a></li>
<li>Select the Startup option from the settings menu and select the desired default task in the "<i><b>Specified named task</b></i>" drop down.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZovbxT9EiwyzxbcVwrVwjDwN6O6XVKzJfIQd5xEFAv5KbgDeHzjEA6jl9q2AKw8f-GG-J6tjrDAehl3hYWs67AIyMeE6a6sd3TrPnriZ7pgKyt4Jl2Yo_70AKJw2p40-DE7GjlDgepV2o/s1600/cmder-settings-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="173" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZovbxT9EiwyzxbcVwrVwjDwN6O6XVKzJfIQd5xEFAv5KbgDeHzjEA6jl9q2AKw8f-GG-J6tjrDAehl3hYWs67AIyMeE6a6sd3TrPnriZ7pgKyt4Jl2Yo_70AKJw2p40-DE7GjlDgepV2o/s400/cmder-settings-03.png" width="400" /></a><br /><br />This will give you the option of setting one of the default tasks as your default startup terminal. Such as Cmd.exe, PowerShell, Bash, etc. If you require anything more specific then use the Tasks section to define your own custom task. This is handy too for extending the functionality of Cmder, like setting up a SQLCMD terminal or specific PowerShell terminals for different technologies (e.g. Azure, SQL, Exchange, etc).</li>
<li>If you want to have Cmder start at logon, something I have found increasingly important as I grew to depend on Cmder for my terminal use. Select the <b><i>Integration > Default term</i></b> option in the settings menu. Then check the "<b style="font-style: italic;">Register on OS Startup</b>" option. NOTE: For this setting to work you also need to set the "<b style="font-style: italic;">Force ConEmu as default terminal for console applications</b>", and you will be prompted to do so if you don't check it first. This setting will have no effect without that setting also enabled.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxzD4TmkXicgH3H3SJdWb_ZrcPVPXLyRGNO9R1iYVbmDadtdo6RIFKVVXt1BLFf9LDS5IYXYySGmflQ7rDSfLEmdXayrhrzY9vlruU2yh6hXxr9-AyVH7G4zRa_CBiIAF-V4AXVWKuSJCQ/s1600/cmder-settings-06.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="196" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxzD4TmkXicgH3H3SJdWb_ZrcPVPXLyRGNO9R1iYVbmDadtdo6RIFKVVXt1BLFf9LDS5IYXYySGmflQ7rDSfLEmdXayrhrzY9vlruU2yh6hXxr9-AyVH7G4zRa_CBiIAF-V4AXVWKuSJCQ/s320/cmder-settings-06.png" width="320" /></a></li>
<li>Once you have configured the required settings, such as Task Bar minimisation and other desired behaviors, click <b><i>Save Settings</i></b>.</li>
<li>Now return to your other applications; when you need a terminal, press <b><i>Ctrl+`</i></b> (that is, Ctrl plus the tilde character) and the Cmder terminal will load.</li>
</ol>
<div>
For documentation of any of the settings see: <a href="https://conemu.github.io/en/TableOfContents.html#settings" target="_blank">https://conemu.github.io/en/TableOfContents.html#settings</a><br />
<br /></div>
<div>
Some handy keyboard shortcuts; for the full list see <a href="http://cmder.net/">http://cmder.net</a></div>
</div>
<div>
<br /></div>
<div>
Ctrl+` : Show/Hide the cmder terminal</div>
<div>
Ctrl+Number : Switch to the corresponding terminal tab</div>
<div>
Shift+Alt+Number : Fast new tab (e.g. Shift+Alt+1 for Cmd.exe or Shift+Alt+2 for PowerShell)</div>
<div>
<br />
For the official documentation see <a href="https://conemu.github.io/en/TableOfContents.html" target="_blank">https://conemu.github.io/en/TableOfContents.html</a></div>
<div>
<br /></div>
<div>
Happy terminal emulating</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
<div>
<br /></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-68613334402610405822017-03-28T14:48:00.000+10:002017-03-29T12:40:30.852+10:00Why VSCode has replaced Management Studio as my default SQL Database and Query editor<div dir="ltr" style="text-align: left;" trbidi="on">
Firstly, let me state that when I originally set out on an IT career I was heading down a developer path, and I certainly had a number of developer-type roles over the years, or found ways of continuing development projects while working in infrastructure roles... probably why I have an interest in DevOps. Taking that into account, it's no surprise that for my entire career I have been comfortable working in code rather than relying on GUIs. Even through all the years as a SQL DBA armed with SQL Server Management Studio (SSMS), I was always most comfortable working in T-SQL rather than the wizards. That probably comes from the days of Enterprise Manager and Query Analyser (ahhhhh, nostalgia). Now, the MS product team has done a great job of improving the wizards in SSMS and making tasks as easy as they can be in the tools. I will also state that this post is by no means saying SSMS is dead, because there are just some things where it is better positioned.<br />
<br />
What I will cover in this blog post is why my go to TSQL editor and tool for general database work is now VSCode with the MSSQL extension.<br />
<br />
Please don't take this as a statement that I have now uninstalled SSMS or Visual Studio with SQL Data Tools (SSDT) from my laptop (I wish), but I have always found those tools a bit bloated in their memory consumption when all I want to do is connect to a database, run some queries, or make some basic changes. What I will show is why and how I now perform those tasks with VS Code; for anything more in depth, like designing SSIS packages or performance troubleshooting, I still rely on the existing tools (for now).<br />
<br />
Another factor driving my adoption of a text-based editor is that a large amount of my work is now with Azure and other cloud solutions, where the majority of the work is console or script based.<br />
<br />
Now that you know why I have arrived at this place, let's get into how I set up and use VS Code for this purpose. I look forward to healthy discussions with people around this, because I am not a believer in the "one size fits all" approach to a tool set either, so it is always great to hear what others use.<br />
<br />
<b><u><br /></u></b>
<b><u>Setup and configure your environment</u></b><br />
Here are the steps I use to setup my VSCode environment:<br />
<br />
<ol style="text-align: left;">
<li>Download and install VSCode <a href="https://code.visualstudio.com/download" target="_blank">https://code.visualstudio.com/download</a></li>
<li>Open VS Code</li>
<li>Press <b><i>Ctrl+Shift+X</i> </b>(on windows)</li>
<ol>
<li>Alternatively use the View > Extensions menu item</li>
</ol>
<li>Locate and install the following extensions</li>
<ol>
<li>vscode-icons</li>
<li>mssql</li>
<li>powershell</li>
<li>c#</li>
</ol>
<li>Configure the extensions</li>
<ol>
<li>From File > Preferences > File Icon Theme select "VSCode Icons"<br />This will ensure that any files you open and access have nicely displayed icons to make your experience easier.</li>
</ol>
<li>Configure the environment settings</li>
<ol>
<li>From File > Preferences > Settings</li>
<ol>
<li>VS Code works in two setting modes, User and Workspace. User should be personal preferences and Workspace should be used for project-specific settings that will ship with the repo.<br />User settings are stored in the file <i>C:\Users\&lt;username&gt;\AppData\Roaming\Code\User\settings.json</i>, but you shouldn't have to edit that manually as the VS Code window provides the best method for working with these files.</li>
</ol>
<li>I don't change too many settings at this time from the default, but some to consider depending on your needs are:<br /><a href="https://github.com/Microsoft/vscode-mssql/issues/549" target="_blank">mssql.splitPaneSelection</a> = "current|next|end"<br /><br />IntelliSense will help you complete the values if you need to see what is available.</li>
</ol>
<li>Now you should be ready to start working inside VS Code. However, I recommend reading the release notes when new updates are made, as the developer community is extremely active in improving VS Code and there are always new and useful features being added.</li>
</ol>
<div>
<br /></div>
<div>
While VS Code has a built-in integrated terminal, I like the <b>cmder</b> tool for my terminal use. If you aren't familiar with cmder, check it out: it is very versatile, can run multiple terminals and languages, and best of all has a Quake mode. </div>
<div>
<a href="http://cmder.net/" target="_blank">http://cmder.net</a></div>
<div>
For more info check out my post on it <a href="http://blog.matticus.net/2017/03/cmder-enhance-your-windows-terminal.html" target="_blank">http://blog.matticus.net/2017/03/cmder-enhance-your-windows-terminal.html</a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b style="text-decoration: underline;">Connecting to a database and executing SQL queries</b><br />
There are many tricks and ways to work within VS Code, but here is a simple walkthrough of the basics to get you started.</div>
<div>
<ol style="text-align: left;">
<li>Open VS Code if you haven't already</li>
<li>You do not need to open a folder or save files just to run queries, but it can be beneficial. Think of a folder like a Project/Solution, but in a simpler (faster) format. This works great with Git and cross-platform collaboration.<br />For this walkthrough just create a new file (click <b style="font-style: italic;">New File</b> on the welcome page)</li>
<li>Without saving the file, lets make sure we are in the right language mode. <br /><br />Click the current language in the tray menu (e.g. Plain Text)<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTp-gTmk5oEXo_5dIe9My2tL9g_NgvnYCvvQQl9yPlL__c0kAKFJUJG61ZskFoTyCiwNp-Xnp5UNwO6IUKa15WrkM0MAcDRfkt7AvJNkN3Kx4Dd-GJEFo0e9crP3rXo3K8ATxbjDUVtq3-/s1600/LanguageMode-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="52" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTp-gTmk5oEXo_5dIe9My2tL9g_NgvnYCvvQQl9yPlL__c0kAKFJUJG61ZskFoTyCiwNp-Xnp5UNwO6IUKa15WrkM0MAcDRfkt7AvJNkN3Kx4Dd-GJEFo0e9crP3rXo3K8ATxbjDUVtq3-/s320/LanguageMode-01.png" width="320" /></a><br /><br />This will open the command palette with prompts to select the language. Either browse or type to find your language and select it.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8JgdWHEjw9xti-Nx0xU-pgg4SQ1FT-6MtfHKEN2Vp2ntl_Hy3AIZ0QCrdqCsn0fSkU48R_c3_8ud-Jlb502Be9ZNTPNBWKKkuyHVub2jL-csU0mVXV0yda8pOP8rWNhnFKNRgbk2RMKL-/s1600/LanguageMode-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="164" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8JgdWHEjw9xti-Nx0xU-pgg4SQ1FT-6MtfHKEN2Vp2ntl_Hy3AIZ0QCrdqCsn0fSkU48R_c3_8ud-Jlb502Be9ZNTPNBWKKkuyHVub2jL-csU0mVXV0yda8pOP8rWNhnFKNRgbk2RMKL-/s320/LanguageMode-02.png" width="320" /></a><br /><br />Now the correct SQL language is shown in the tray menu<br /><br /> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjfTO2Fp-oh2DCoycamlVOXdv1zDBWcJeF8G1aAL9O-1AirxYGuWPeTnXilVuOtJJQTvbrB-xraj_87jZK2N1QGfOWXk5WuP7Z2aKLj2HIVJzTULlYhXurSmqzWZUZQrGERBfISUQ3XKNXo/s1600/LanguageMode-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjfTO2Fp-oh2DCoycamlVOXdv1zDBWcJeF8G1aAL9O-1AirxYGuWPeTnXilVuOtJJQTvbrB-xraj_87jZK2N1QGfOWXk5WuP7Z2aKLj2HIVJzTULlYhXurSmqzWZUZQrGERBfISUQ3XKNXo/s1600/LanguageMode-03.png" /></a><br /><br />Now the color coding and formatting, along with IntelliSense, will be suitable for SQL Server development.<br /><br />TIP: When you save a file, the language mode is automatically detected based on the file extension.</li>
<li>Press <b><i>Ctrl+Shift+P</i></b> to open the Command Palette</li>
<li>Type "mssql" and select the <b><i>mssql: Connect </i></b>option or press <b><i>Ctrl+Shift+C</i></b><br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgZ1KNBR3IYc5HMWqhnX2Mlm7Fib1lhHdFG9pAP7M9No-MQ_s4hhZ8pVi5-sJh61I-aJdmuRr1JPrybEGENN9wP-B9FO4ZCBc4twSGv0x3JJ_Yw9kRSCgKZ2vnxkACs2eX5lmFiBCQlpCkv/s1600/CommandPalette-mssql-connect.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="130" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgZ1KNBR3IYc5HMWqhnX2Mlm7Fib1lhHdFG9pAP7M9No-MQ_s4hhZ8pVi5-sJh61I-aJdmuRr1JPrybEGENN9wP-B9FO4ZCBc4twSGv0x3JJ_Yw9kRSCgKZ2vnxkACs2eX5lmFiBCQlpCkv/s320/CommandPalette-mssql-connect.png" width="320" /></a><br /><br />TIP: Make sure your focus is in a file with the SQL language set and not any other areas of VSCode when you press Ctrl+Shift+C as otherwise it will open a console as per those keyboard shortcuts default.</li>
<li>Select an existing connection profile or select the <b><i>Create Connection Profile</i></b> to create a new one. So lets create one.</li>
<li>Follow the wizard filling out your server/instance, database (optional), authentication etc.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsucLFZIRjEUCQh-wZ2cj8OnhPrEm4-ry5WmCecfNkZXZkQvh3gs4fC1YZNc3APuTd5Pm5sqCtspvgHG1fvhN1dl6zr7g8rUY9g7zlzGdI81ac1Bzww0odA_RwWYaoKdPVgO9BrSJbkFfL/s1600/CommandPalette-mssql-connect-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="40" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsucLFZIRjEUCQh-wZ2cj8OnhPrEm4-ry5WmCecfNkZXZkQvh3gs4fC1YZNc3APuTd5Pm5sqCtspvgHG1fvhN1dl6zr7g8rUY9g7zlzGdI81ac1Bzww0odA_RwWYaoKdPVgO9BrSJbkFfL/s320/CommandPalette-mssql-connect-02.png" width="320" /></a><br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtsxqyOrRZ0XHSe4xfjekXnl-zqBFLd1nKfMAKVla9GaxantSrE5nEIJaM-Z8m_oMek8Y4rG3JMOdFUv5K6oFFp-1r4xOnZ_KI5UOFzJbhbKKIMqgtHPptEu49xLv44S9k38x0Pgh0RnZL/s1600/CommandPalette-mssql-connect-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="38" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtsxqyOrRZ0XHSe4xfjekXnl-zqBFLd1nKfMAKVla9GaxantSrE5nEIJaM-Z8m_oMek8Y4rG3JMOdFUv5K6oFFp-1r4xOnZ_KI5UOFzJbhbKKIMqgtHPptEu49xLv44S9k38x0Pgh0RnZL/s320/CommandPalette-mssql-connect-03.png" width="320" /></a><br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRBerO2C-aM9garnZfq5FUZRsryM_psiTt-3A-2i_OSIrCBQtc-zmfnTBXuUftFjjuJQDzPngs4ppjCR0eaRmOw18Cja1wPNZVmo89_WBzJ6t4UKdmYnQa6yZ7HodgN0A5luZwwb-WTpxO/s1600/CommandPalette-mssql-connect-04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="50" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRBerO2C-aM9garnZfq5FUZRsryM_psiTt-3A-2i_OSIrCBQtc-zmfnTBXuUftFjjuJQDzPngs4ppjCR0eaRmOw18Cja1wPNZVmo89_WBzJ6t4UKdmYnQa6yZ7HodgN0A5luZwwb-WTpxO/s320/CommandPalette-mssql-connect-04.png" width="320" /></a><br /><br />Once you start to connect the status is shown in the tray menu<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgnKZb8wEjA_QrOQfUfP70LJFXjKgT5jVvUKrE-Og9VVzoR5p407Jruv1Qpa1MXCxAfmb30jGbRrVg7uyE2nDP3OZwvC1oCQ2QZ3pzJdq_fGeLN7wMfp-og4U4amvqeKTu_DvjhtMwgBcPE/s1600/CommandPalette-mssql-connect-05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="33" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgnKZb8wEjA_QrOQfUfP70LJFXjKgT5jVvUKrE-Og9VVzoR5p407Jruv1Qpa1MXCxAfmb30jGbRrVg7uyE2nDP3OZwvC1oCQ2QZ3pzJdq_fGeLN7wMfp-og4U4amvqeKTu_DvjhtMwgBcPE/s320/CommandPalette-mssql-connect-05.png" width="320" /></a><br /><br />Any errors connecting will be shown with an overlay<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHhYR85F4QFLg917dEHdxVV_JnRx4mk1dpI2jumUEeHa0P-euEkIi5KxZIbc-85wRLehZhaOPvvFC6dKkIdhhmeue8IfpwOOM-sqOKfCzmYLKs53AE5fPfr01bcW0oSaZjoTAvfZm7a20l/s1600/CommandPalette-mssql-connect-06.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="24" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHhYR85F4QFLg917dEHdxVV_JnRx4mk1dpI2jumUEeHa0P-euEkIi5KxZIbc-85wRLehZhaOPvvFC6dKkIdhhmeue8IfpwOOM-sqOKfCzmYLKs53AE5fPfr01bcW0oSaZjoTAvfZm7a20l/s320/CommandPalette-mssql-connect-06.png" width="320" /></a><br /><br />Once connected VS Code will update intellisense dictionary and perform other operations set by the extension.</li>
<li>Now write your query in the file</li>
<li>When ready you can execute the query in a few methods<br /><br />Use the Command Palette and the <b><i>MSSQL: Execute Query </i></b>command.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2XG_0GNOGpXtQCG1FOvuRXzA_6N8RvWvlKNo4Phr-LkPkUJZXshqNxKLHVJhwdATneqltqDNzjxdD5XTqIT0gyFSSqthVQIhJYfiaGGqE_96DihPMkrflzBFgJ9_yQuuQR0v9MzzcpTNk/s1600/CommandPalette-mssql-execute-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="36" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2XG_0GNOGpXtQCG1FOvuRXzA_6N8RvWvlKNo4Phr-LkPkUJZXshqNxKLHVJhwdATneqltqDNzjxdD5XTqIT0gyFSSqthVQIhJYfiaGGqE_96DihPMkrflzBFgJ9_yQuuQR0v9MzzcpTNk/s320/CommandPalette-mssql-execute-01.png" width="320" /></a><br /><br />Right click in the editor and select<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgEFI9j8R9Z1vsxX9EiwXlJoLKwVsy8lXPx1F8EaUed0_EPofrJ_SIRoMa66B7aeLjepPQgyVOpVYb1HA44OiKNMb1hiZScZYutXOWjIjbwkfDqqbAkMm03rv19iunTZEGTQgjOnmOcxTYa/s1600/CommandPalette-mssql-execute-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="248" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgEFI9j8R9Z1vsxX9EiwXlJoLKwVsy8lXPx1F8EaUed0_EPofrJ_SIRoMa66B7aeLjepPQgyVOpVYb1HA44OiKNMb1hiZScZYutXOWjIjbwkfDqqbAkMm03rv19iunTZEGTQgjOnmOcxTYa/s320/CommandPalette-mssql-execute-02.png" width="320" /></a><br /><br />or my favorite just simply press <b><i>Ctrl+Shift+E</i></b></li>
<li>The query results tab will open. By default this opens in a new split window column, or the next one if you have multiples. The idea here is so you can see the query and result all in one window.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMPZryO26cdQFlllgJv9c8x-lXiqNXxbckrsB9QfF1-vPlTfaTy3Yke5ZSPYWDgbFtP2_-C3qYuDdDk19v_9kQ2fgCRo0xZi8ZFgW7TQzCJKfbZmDXA0o1sU-z0BOZeA3xLMxm4JpTuUzx/s1600/CommandPalette-mssql-results-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="135" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMPZryO26cdQFlllgJv9c8x-lXiqNXxbckrsB9QfF1-vPlTfaTy3Yke5ZSPYWDgbFtP2_-C3qYuDdDk19v_9kQ2fgCRo0xZi8ZFgW7TQzCJKfbZmDXA0o1sU-z0BOZeA3xLMxm4JpTuUzx/s320/CommandPalette-mssql-results-01.png" width="320" /></a><br /><br />You can put the query results at the bottom of the screen, which might be a more familiar view to those used to SSMS. To do this select the <b><i>Toggle Editor Group Layout </i></b>from the View menu, or press <b><i>Alt+Shift+1</i></b>.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNqs68IWvkJYUPmk4cvxfoLZKMx2Wr4jhaqvTc1Wh3ba0Km-wPXx845NDt0wz2ERkrO9topUOeqUn2KO4t9OjHVAGQKDW6UUBcfX_ZFLLeHumUzUbC56baYa8465ghOIKdIg1luk7wFsP6/s1600/CommandPalette-mssql-results-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNqs68IWvkJYUPmk4cvxfoLZKMx2Wr4jhaqvTc1Wh3ba0Km-wPXx845NDt0wz2ERkrO9topUOeqUn2KO4t9OjHVAGQKDW6UUBcfX_ZFLLeHumUzUbC56baYa8465ghOIKdIg1luk7wFsP6/s320/CommandPalette-mssql-results-03.png" width="216" /></a><br /><br />Now the results are below the query you executed.<br /><br /><a 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-TC327kI1lUEoXJ1NjNF53S9ylWES5yjuSI13NBIo0f4hylkYS7tPNCmeTgwm1X9QNpc0GWABChdT1VIGn0VvraUA_1Yjx4h2pV7ibiJU3DXe4Qgm-ODiQFykGolmXNchVxPrhEP_v7wZ/s1600/CommandPalette-mssql-results-04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-TC327kI1lUEoXJ1NjNF53S9ylWES5yjuSI13NBIo0f4hylkYS7tPNCmeTgwm1X9QNpc0GWABChdT1VIGn0VvraUA_1Yjx4h2pV7ibiJU3DXe4Qgm-ODiQFykGolmXNchVxPrhEP_v7wZ/s320/CommandPalette-mssql-results-04.png" width="320" /></a><br /><br />Alternatively you can also set the query results to display in the current split window column (e.g. new tab)<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEGoiDjfvk0G1E9vpwkxYYNEJ5zjrXGepPZiBIoTM7kJ23ALDzgu4Pf96PvfNM-3vbP8pmmRwmbb0Ha4J75-Z5RSNJbiQYo36JRf95IMRZhKi5m8Udw2jPrSSjo64ZjL_M4MsLUgwaGopc/s1600/CommandPalette-mssql-results-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="239" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEGoiDjfvk0G1E9vpwkxYYNEJ5zjrXGepPZiBIoTM7kJ23ALDzgu4Pf96PvfNM-3vbP8pmmRwmbb0Ha4J75-Z5RSNJbiQYo36JRf95IMRZhKi5m8Udw2jPrSSjo64ZjL_M4MsLUgwaGopc/s320/CommandPalette-mssql-results-02.png" width="320" /></a><br /><br />So as you can see you can customise where the results are displayed just like in SSMS.<br /><br />Something to keep in mind is that a new result tab will open for every file you execute a query from, but if you re-run a query or a new query from the same file then it will use the existing results tab for that file.</li>
<li>Now just like the query editor in SSMS, it will either execute the entire file contents or what you have selected. So like in this example it will just execute the selected query and not the entire file contents.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCff_78_ISlsdFbk6NxUVDYzqTnj9V1phyhkxWjn3CFZgjdG-hLzzplQtF4hMFL1moKe4ltKWaW6xeS9s1Po1peQsIkHLBIz-u9ccu7SQ8a34YFQJsNa0uUDf6IkRg9TOedGWBDGhJTozY/s1600/CommandPalette-mssql-execute-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="241" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCff_78_ISlsdFbk6NxUVDYzqTnj9V1phyhkxWjn3CFZgjdG-hLzzplQtF4hMFL1moKe4ltKWaW6xeS9s1Po1peQsIkHLBIz-u9ccu7SQ8a34YFQJsNa0uUDf6IkRg9TOedGWBDGhJTozY/s320/CommandPalette-mssql-execute-03.png" width="320" /></a><br /><br />This is why I like the keyboard shortcut <b><i>Ctrl+Shift+E</i></b> to execute queries because it becomes really quick to work from a file and execute different selected queries as desired.</li>
</ol>
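The walkthrough above can be exercised with a simple script like the following. With the file's language mode set to SQL and a connection active, select a single statement and press Ctrl+Shift+E to execute just that selection. These are generic example queries against standard system views, not from the original post:

```sql
-- Quick sanity check: confirm which server and database the
-- mssql extension is currently connected to.
SELECT @@SERVERNAME AS ServerName,
       DB_NAME()    AS CurrentDatabase;

-- List the databases on the instance. Select just this statement
-- and press Ctrl+Shift+E to execute only the selection.
SELECT name, create_date
FROM sys.databases
ORDER BY name;
```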
</div>
<div>
Obviously some people will really miss Object Explorer for understanding the schemas of databases they are not familiar with, but keep in mind VS Code is designed for developers, so typically you would have a folder that contains all the scripts for creating the database, and therefore your schema to refer to, or you would already be familiar with the schema. However, as we all know, there are plenty of system views you can easily query to get that data (because, after all, that is all Object Explorer does). </div>
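For example, a couple of catalog view queries cover much of what the Object Explorer tree displays. This is a generic sketch; the table name <i>dbo.Department</i> is just a placeholder, substitute your own:

```sql
-- Tables grouped by schema, roughly what Object Explorer's tree shows
SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
ORDER BY s.name, t.name;

-- Column definitions for a single table (placeholder table name)
SELECT c.name  AS ColumnName,
       ty.name AS DataType,
       c.max_length,
       c.is_nullable
FROM sys.columns AS c
JOIN sys.types AS ty ON ty.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID(N'dbo.Department')
ORDER BY c.column_id;
```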
<div>
<br /></div>
<div>
<br /></div>
<div>
Happy SQL Scripting.</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<div>
<b><u>Registering your SQL Server connections</u></b></div>
<div>
VS Code has a JSON-based configuration system. SQL connections can be saved in the User Settings file; think along the lines of "Registered Servers" in SSMS. I have already briefly touched on how to create a new profile when you connect. However, here is how to register them ahead of time and manage existing connection profiles.</div>
<div>
<br /></div>
<div>
Keep in mind, though, that these connections are not unique to a project/solution/folder; they are unique to your user settings. So make sure you give them meaningful names to easily identify which databases/projects they belong to.</div>
<div>
<ol>
<li>Press <b><i>Ctrl+Shift+P</i></b> to open the Command Palette</li>
<li>Type "mssql" and select the <b><i>MSSQL: Manage Connection Profiles</i></b> option<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhY57aAGkdH7Q8FJhR_Xd01gAPC4Apzmpkon5x0K1y_10U4RR9AQY2PTfoD1Uf1AhTLSkIJVAGmy7xqPEG-U0geRtEcy4941WQkkDTbEbngRVBu3WiHexZORYtDHjsniJp-ku5XiHjk5W6E/s1600/CommandPalette-mssql-manageconnection-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="133" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhY57aAGkdH7Q8FJhR_Xd01gAPC4Apzmpkon5x0K1y_10U4RR9AQY2PTfoD1Uf1AhTLSkIJVAGmy7xqPEG-U0geRtEcy4941WQkkDTbEbngRVBu3WiHexZORYtDHjsniJp-ku5XiHjk5W6E/s320/CommandPalette-mssql-manageconnection-01.png" width="320" /></a></li>
<li>The Command Palette will then prompt you with some more options.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGyJ1agE0-6FpnvqCfWbBHrBlFmpJ65rSW6yW3kD4TZDyKpKNogUzMfVKBcQJxmCoNoQQs3PHvwgPgCIBXcT5JbIbQW6vghjQf_YJg1nH8bR6_0wT5MJDJibz9gX0-4wiFAfgq58HkckZ8/s1600/CommandPalette-mssql-manageconnection-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="68" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGyJ1agE0-6FpnvqCfWbBHrBlFmpJ65rSW6yW3kD4TZDyKpKNogUzMfVKBcQJxmCoNoQQs3PHvwgPgCIBXcT5JbIbQW6vghjQf_YJg1nH8bR6_0wT5MJDJibz9gX0-4wiFAfgq58HkckZ8/s320/CommandPalette-mssql-manageconnection-02.png" width="320" /></a><br /><br /><b><i>Create</i></b>: This will walk you through creating a new profile via the Command Palette prompts<br /><b><i>Edit</i></b>: This will open the User Settings JSON file and allow you to manually edit the connection profiles. NOTE: Passwords can be saved in an encrypted form but are not stored in this file for security.<br /><b><i>Remove</i></b>: This will walk you through removing an existing profile via the Command Palette prompts<br /><br />This is an example of the JSON configuration provided with the <b><i>Edit </i></b>option.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGS7LeIxp_X74w7Qoi6LRIhS_HJxdK8m9ari4DAfQ8tT7QtC7D_9VvsQ3LYSko_1xVlZpnYbEr7GxE70s_gFd1uU3BEVPRped1JgVboyWZkLWJkqNlbypdfrVTjHDt7ITmNM_qYwHlt3Vr/s1600/CommandPalette-mssql-manageconnection-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="171" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGS7LeIxp_X74w7Qoi6LRIhS_HJxdK8m9ari4DAfQ8tT7QtC7D_9VvsQ3LYSko_1xVlZpnYbEr7GxE70s_gFd1uU3BEVPRped1JgVboyWZkLWJkqNlbypdfrVTjHDt7ITmNM_qYwHlt3Vr/s320/CommandPalette-mssql-manageconnection-03.png" width="320" /></a></li>
<li>Once you have configured the profile you can then simply select it from the list provided under the <b><i>MSSQL: Connect</i></b> command.</li>
</ol>
</div>
</div>
<div>
<br /></div>
<div>
<b><u>References</u></b></div>
<div>
VS Code official site <a href="https://code.visualstudio.com/">https://code.visualstudio.com/</a><br />
VS Code opensource repo <a href="https://github.com/Microsoft/vscode">https://github.com/Microsoft/vscode</a></div>
<div>
VS Code Extensions <a href="https://marketplace.visualstudio.com/VSCode">https://marketplace.visualstudio.com/VSCode</a><br />
Cmder <a href="http://cmder.net/">http://cmder.net/</a><br />
<br /></div>
<div>
<br /></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
<div>
<br /></div>
<div>
<br /></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-71335409791670971992017-02-02T11:18:00.002+10:002017-02-02T11:18:57.994+10:00Fixing Merge conflicts in VS Code projects and VSTS (Visual Studio Online) repos<div dir="ltr" style="text-align: left;" trbidi="on">
I use a combination of GitHub and Visual Studio Team Services (Visual Studio Online, VSTS, etc.) repositories. GitHub is pretty straightforward for resolving merge conflicts in pull requests through its online portal. Typically when I have come across merge conflicts in my VSTS repositories it has been with Visual Studio projects, so I could resolve them with the VS GUI. However, I recently experienced a merge conflict with a pull request for one of my VSTS repositories where the code was written in Visual Studio Code. For working with the VSTS Git repo I use posh-git, which comes with the GitHub Desktop install, but the logic here should transfer.<br />
<br />
The situation I faced was that I had been working in a branch and editing a file, which had also been edited by a colleague in the master branch (I know, I know...). So when I pushed my changes from my local repo to VSTS and then used the web portal to create a Pull Request to merge the changes back into the master branch I received an error that there was a merge conflict blocking the pull request.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvB203lUfF6ULDAaXNAIwg09SpnuFOo70VQnMUSvUTvTeXrcvrz3Trs8yEhRB6sDvI5Hpi2ImcJJRpGPwbcBQgKTtc-yvGJ1UOkWYAij70B-Dn6pklKbvh7W7sflNZj95hIRnL0nEjrS6L/s1600/001.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="97" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvB203lUfF6ULDAaXNAIwg09SpnuFOo70VQnMUSvUTvTeXrcvrz3Trs8yEhRB6sDvI5Hpi2ImcJJRpGPwbcBQgKTtc-yvGJ1UOkWYAij70B-Dn6pklKbvh7W7sflNZj95hIRnL0nEjrS6L/s320/001.png" width="320" /></a></div>
<br />
As I mentioned, normally in this situation I am used to being offered options to select which file/change you want to be kept. In VSTS I was not able to find any such option, and all the articles online seemed to cover how to fix this in the Visual Studio GUI. While I do have VS installed, I didn't want to use it, as I am trying to move to VS Code and the Git command line where possible. I never did think to try the Git menus in VS Code, or the Git GUI... and to be honest there is probably a better way to resolve this, but this is how I resolved the conflict.<br />
<br />
First I abandoned the Pull Request using the button in the screen shot above.<br />
<br />
Next I ran <b>git status</b> to make sure that the "working tree is clean"<br />
<br />
Then, by looking at the history of the file in the VSTS portal within my branch, I tracked down the original commit id that the file was initialised with. Taking that identifier I ran <b>git reset InitialCommitId PathToFile</b> (e.g. <b>git reset e99a################################c60 XML\9.xml</b>)<br />
<br />
Now when you run git status it should show the file as unstaged. It will also show that there are changes for the same file to be committed. We need to discard the changes in the working directory (resetting it to the initial version the branch was created from). So run <b>git checkout -- XML\9.xml</b><br />
<br />
Now git status should show a change to be committed. Commit the change with <b>git commit -m "description"</b><br />
<br />
Your working directory should now show it is ahead of the origin branch, so push the changes upstream with <b>git push</b>.<br />
<br />
Now your branch in VSTS does not contain any conflicts. I wasn't able to reactivate the existing pull request and have it succeed (it still showed the conflict), so I created a fresh pull request for the branch, and it was no longer blocked by a merge conflict.<br />
<br />
Hopefully this helps others. If you have other suggestions on working around this please comment below.<br />
<br />
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-79302288063008011802016-12-20T16:24:00.002+10:002016-12-20T20:03:47.403+10:00Why is the with_copy option critical when restoring a database configured with StretchDb<div dir="ltr" style="text-align: left;" trbidi="on">
StretchDB is a new capability in SQL Server 2016 which provides the ability to effectively partition your data into warm and cold sets and, more importantly, allows SQL Server to automatically host the cold data in an Azure SQL Database... sounds great in concept, especially in cases where you have a large amount of cold data which just needs to be queried and not updated (e.g. archived data for reporting).<br />
<br />
Now, there is one critical thing you really need to be aware of, and that is when it comes time to restore, or more specifically migrate, an on-premises database that is stretched. Let's take this scenario for example.<br />
<br />
<i>You are planning to upgrade the production SQL Server to SQL Server 2016 SP1. The business needs do not allow for an in-place upgrade, so you have built a new server and are planning to migrate the production database to it.</i><br />
<br />
It could equally be a DR scenario, but the issues I am talking about are most likely to arise during migrations.<br />
<br />
The issue you need to be aware of is that after restoring the on-premises database you then need to reauthorize the connection to the Azure SQL Database (official details at <a href="https://msdn.microsoft.com/en-us/library/mt733205.aspx">https://msdn.microsoft.com/en-us/library/mt733205.aspx</a>). When you perform this step, you have to specify the parameter <b>WITH_COPY</b>, which can either point the database to the existing remote table in the Azure SQL Database, or create a new copy of the remote table to use. The recommended approach is to use a new copy of the remote table (<b><i>with_copy = 1</i></b>). I would also recommend this approach for database migrations, and then manually clean up the old table. The reason I recommend this is that, as I will show in this post, if your old/current database is still in use and you point a new database to the same remote table, the data reconciliation processes can conflict and this will result in <b><span style="color: red;">cold data loss</span></b>.<br />
<br />
<br />
<b><u>So let's explore the issue.</u></b><br />
<br />
Firstly here is the environment I am working in:<br />
<br />
On-premises SQL Server 2016 SP1 (13.0.4001.0). I have one database StretchDBDemo which contains the following tables:<br />
<br />
<ul style="text-align: left;">
<li>dbo.Department</li>
<ul>
<li>DepartmentID, DepartmentName, OfficeLoc</li>
</ul>
<li>dbo.People</li>
<ul>
<li>PersonID, FullName, IsEmployee, PhoneNumber, EmailAddress</li>
</ul>
</ul>
<br />
The full schema is available in the script at the end of this post.<br />
<br />
We are going to use the <b><i>dbo.People </i></b>table to demonstrate this issue. So let's start with some data.<br />
<pre class="brush:sql">INSERT INTO [dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (1, 'Matt', 1, '0','yourname@email.com');
INSERT INTO [dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (2, 'Steve', 0, '0','yourname@email.com');
</pre>
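<div>
<br /></div>
<div>
Before enabling the feature it is worth sanity-checking which rows the filter predicate we are about to create (IsEmployee = 0) will actually migrate. With the two rows above, only 'Steve' qualifies:</div>
<pre class="brush:sql">SELECT [PersonID], [FullName], [IsEmployee]
FROM [dbo].[People]
WHERE [IsEmployee] = 0;
</pre>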
<div>
<br /></div>
<div>
Next step is to set up the Stretch Database feature. This has to be done through the GUI and is pretty straightforward, but here are the steps I have used:</div>
<div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<ol style="text-align: left;">
<li>Right click on the Database StretchDBDemo in Object Explorer</li>
<li>Select Tasks > Stretch > Enable</li>
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<ol>
<li>In the GUI select Next</li>
<li>Place a tick in the box next to the dbo.People table. Notice the warning icon. Important to note as it does have some relevance to the behaviour we will explore.<br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjhOMQA1R2axsDB5e5SeF2pM5wuhRjospEq7Ht_HEp9hjdu8cC4lFvVGbY-SCvYh5zcd7-bHVpyNY_hh9UFMoVnuDWE4ltjliOXhWSSFsq17rsDyOylWiZScYsod39WovHY-XUTF7Bt8nPd/s1600/001.PrimaryKeyWarning.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjhOMQA1R2axsDB5e5SeF2pM5wuhRjospEq7Ht_HEp9hjdu8cC4lFvVGbY-SCvYh5zcd7-bHVpyNY_hh9UFMoVnuDWE4ltjliOXhWSSFsq17rsDyOylWiZScYsod39WovHY-XUTF7Bt8nPd/s320/001.PrimaryKeyWarning.png" width="320" /></a></li>
<li>Click the "Entire Table" link under the Migrate column to launch the filter wizard</li>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<ol>
<li>Setup a filter called IsEmployee False, with the predicate "IsEmployee = 0". Click the Check button and then Done.<br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCtg56B6YBxYs0y1NN44nqB6fZiMqrYGGmQhd-O0bgsAGwL_KwkTNzIi3Sz_pkerQoF2CX_toPK4u_frpD4HuJa6O8PjYLw5lsKTq94yAuUbDZXsjCzXD_HVE3LHmNObPHWndscE7w_nji/s1600/002.StretchDbFilter.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCtg56B6YBxYs0y1NN44nqB6fZiMqrYGGmQhd-O0bgsAGwL_KwkTNzIi3Sz_pkerQoF2CX_toPK4u_frpD4HuJa6O8PjYLw5lsKTq94yAuUbDZXsjCzXD_HVE3LHmNObPHWndscE7w_nji/s320/002.StretchDbFilter.png" width="320" /></a><br />NOTE: If you don't have any rows in the dbo.People table that match this predicate you won't be able to proceed past this point.</li>
</ol>
<li>Click Next and then authenticate to Azure. </li>
<li>Select either an existing Azure SQL Database Server or create a new StretchDB server.</li>
<li>Follow the wizard to completion. I recommend taking a screen snippet of the summary page for reference.<br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSjqwOVu7iPTjuR0HZz4lo3wnnw1qBCCB6ZrBFwKjLlYhyL8VzOpQAreAUAuIzpV96ZKiyiQaZujFw2ENjIVt4S5RwaYJ2Shdb5gZbdwFke432DylnFHSKR4XNgHPOAvui81WWaviPabk_/s1600/002.StretchSummary.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="187" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSjqwOVu7iPTjuR0HZz4lo3wnnw1qBCCB6ZrBFwKjLlYhyL8VzOpQAreAUAuIzpV96ZKiyiQaZujFw2ENjIVt4S5RwaYJ2Shdb5gZbdwFke432DylnFHSKR4XNgHPOAvui81WWaviPabk_/s200/002.StretchSummary.png" width="200" /></a></li>
</ol>
<li>You can then check that the StretchDb feature is working by using the Monitor from the Stretch menu on the Database object.<br /><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhws3ZI-SDAYl2EB_wbWMkiMRl6gJdYC9aZoIzSo-iLS7nSIAnujnhGOSh6Vv_kGo4S44-u-pVWOEai_64XjHk9o6WFSsNsKebfE1fmVCRfFg762LWHAWHr_ChHGefxjFzwiwU8hhYcwnps/s1600/003.StretchDbMonitor.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="297" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhws3ZI-SDAYl2EB_wbWMkiMRl6gJdYC9aZoIzSo-iLS7nSIAnujnhGOSh6Vv_kGo4S44-u-pVWOEai_64XjHk9o6WFSsNsKebfE1fmVCRfFg762LWHAWHr_ChHGefxjFzwiwU8hhYcwnps/s320/003.StretchDbMonitor.png" width="320" /></a><br /><br />You could also query the following tables to check this data manually
<pre class="brush:sql">SELECT * FROM sys.dm_db_rda_migration_status
SELECT * FROM sys.dm_db_rda_schema_update_status
</pre>
</li>
<li>You will need to check the configuration of the stretched database; this is critical as we need the Azure SQL Server address.<br /><br />-- the db config
<pre class="brush:sql">SELECT * FROM sys.remote_data_archive_databases
-- the tables config
SELECT * FROM sys.remote_data_archive_tables
-- the data source
SELECT * FROM sys.external_data_sources
</pre>
</li>
<li>Now that you have the remote server's address, in SSMS Object Explorer connect to the Azure SQL Database server</li>
<ol>
<li>Expand the Databases branch in Object Explorer, expand your stretched remote database (the name is in the tables queried above). Then expand Tables. Note how no tables are displayed here. Any stretched tables are deliberately hidden and you should not query those tables in general practice. However we are troubleshooting/demonstrating an issue so we will query the tables directly.<br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0H4VIznJol9T1l-bbtkS5tO8YLruaV2MHAB4MlNMwl992emsw1bIlFY54dYs3jHaNI5fP8R8cD86pOLMGl5JzZ3nIaNuStsszMPqNDMqkWYiZ10t73Lca-Jr4JshiDgsmox6FGrRPNy5f/s1600/004.AzureSqlServer.StretchDBTables.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="136" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0H4VIznJol9T1l-bbtkS5tO8YLruaV2MHAB4MlNMwl992emsw1bIlFY54dYs3jHaNI5fP8R8cD86pOLMGl5JzZ3nIaNuStsszMPqNDMqkWYiZ10t73Lca-Jr4JshiDgsmox6FGrRPNy5f/s200/004.AzureSqlServer.StretchDBTables.png" width="200" /></a></li>
<li>Query sys.Tables to find the name of the stretched table</li>
<li>Now query the stretched Table in the Azure SQL Database. You should have 1 record for 'Steve'.</li>
</ol>
<li>Let's add a new record to our on-premises database that will be stretched to the remote server.
<pre class="brush:sql">INSERT INTO [dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (3, 'Chris', 0, '0','yourname@email.com');
</pre>
</li>
<li>Using the Monitor or the TSQL queries from earlier check on the status of the StretchDb feature. After some time query the stretched table in the Azure SQL Database again to make sure the new record is stretched there.</li>
</ol>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJD5_cAiiuzWZff-nlBx7qdkkzX5kLY5Qr1jKdp-wHzSCeDR8D8nNHmsPEkKlwZAb5GGB19e8xooJoS1jJOp2Xo77SVGXAm3i00-A3hTBveYh7rKMFvdKXZEjEF62_8aOUVLsr8dzvFrtN/s1600/005.DataInAzureDbStretchedTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="117" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJD5_cAiiuzWZff-nlBx7qdkkzX5kLY5Qr1jKdp-wHzSCeDR8D8nNHmsPEkKlwZAb5GGB19e8xooJoS1jJOp2Xo77SVGXAm3i00-A3hTBveYh7rKMFvdKXZEjEF62_8aOUVLsr8dzvFrtN/s320/005.DataInAzureDbStretchedTable.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div>
<br /></div>
Now the fun part starts. Let's simulate a database migration. In my lab I am just using the same server, but the experience is the same.</div>
<div>
<ol style="text-align: left;">
<li>Take a backup of the on-premises StretchDBDemo database. You could elect to disable Stretch first and bring all the data back on-premises, but we trust the Azure backups to keep our data in the cloud safe.</li>
<li>Next restore the backup to the new database we are migrating the application to. I have used the database name StretchDbDemoNew.</li>
<li>Once the database is restored it won't immediately stretch data, as it needs to be reauthorized to use the remote server. This is outlined in <a href="https://msdn.microsoft.com/en-us/library/mt733205.aspx">https://msdn.microsoft.com/en-us/library/mt733205.aspx</a>.</li>
<ol>
<li>First get the name of the credential that exists. If you were migrating this to another server you will need to recreate the credential on that server. The name of the credential needs to match the remote server address.
<pre class="brush:sql">SELECT * FROM sys.database_scoped_credentials</pre>
</li>
<li>Now use the following statement to reauthorize the database against the existing remote table:
<pre class="brush:sql">-- reauthorize using the existing credential information
USE [StretchDBDemoNew];
GO
EXEC sp_rda_reauthorize_db
@credential = N'&lt;your Azure SQL Database server&gt;',
@with_copy = 0;
GO
</pre>
</li>
</ol>
</ol>
</div>
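<div>
<br /></div>
<div>
If you restored the backup to a different server, the database scoped credential referenced above won't exist there and must be recreated before reauthorizing. A sketch, with placeholder identity and secret values (remember the credential name must match the remote server address):</div>
<pre class="brush:sql">USE [StretchDBDemoNew];
GO
-- a database master key is required to protect the credential
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '&lt;strong password here&gt;';
GO
CREATE DATABASE SCOPED CREDENTIAL [&lt;yourserver&gt;.database.windows.net]
WITH IDENTITY = '&lt;azure sql login&gt;', SECRET = '&lt;azure sql password&gt;';
GO
</pre>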
<div>
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgvQvrE_NOnRrs2sy_F_t6kO_kDGBy7Njcz4AxAl9pSQg8EmBYSyURjoBNdc1hq2hwHfhsKDQFxMWu1sgBUu40SCP26zcq6H773OMZDP7o4zAUGAMw1IztarFEm__062st-fd7ihPP8eBo3/s1600/006.ReAuthorize.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="57" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgvQvrE_NOnRrs2sy_F_t6kO_kDGBy7Njcz4AxAl9pSQg8EmBYSyURjoBNdc1hq2hwHfhsKDQFxMWu1sgBUu40SCP26zcq6H773OMZDP7o4zAUGAMw1IztarFEm__062st-fd7ihPP8eBo3/s320/006.ReAuthorize.png" width="320" /></a></div>
<div>
<br /></div>
<div>
Here is where the problem begins. </div>
<div>
When you execute sp_rda_reauthorize_db and specify with_copy = 0, it uses the existing Azure SQL Database table as the endpoint. So now we actually have two on-premises databases, StretchDBDemo and StretchDBDemoNew, both pointing to the same Azure SQL Database table for stretched data. If you specify with_copy = 1, it instead creates a copy of the data in a new table, and therefore the old and new on-premises databases have different stretch data endpoints.</div>
<div>
<br /></div>
<div>
So why does this matter? Well, let's add some data to the new on-premises database, because we want to make sure it's all working, as you would in any migration.</div>
<div>
<br /></div>
<div>
<pre class="brush:sql">INSERT INTO [StretchDBDemoNew].[dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (10, 'Bill', 0, '0','yourname@email.com');</pre>
</div>
<div>
<br /></div>
<div>
HINT: Any rows I insert into the new database will use a PersonID >= 10, as this helps with the demonstration.</div>
<div>
<br /></div>
<div>
Now this should get stretched to the Azure SQL Database. So switch to that query window and check it's there. </div>
<div>
NOTE: it might take a few minutes for the reconciliation to occur.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRl7mzT3-oHHNahbra-jymW3ZTVTfXoePeB27apnoykGpjotaKPy0rGTAOBgT2VA5Gj58VAeiqINwKEzHnCiUwt3jTgXj2pI0YTtUMnFieqify1QcWIyr8o5O_ZYM7tDMjdiZQLRC3U-me/s1600/007.DataInAzureDbStretchedTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="116" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRl7mzT3-oHHNahbra-jymW3ZTVTfXoePeB27apnoykGpjotaKPy0rGTAOBgT2VA5Gj58VAeiqINwKEzHnCiUwt3jTgXj2pI0YTtUMnFieqify1QcWIyr8o5O_ZYM7tDMjdiZQLRC3U-me/s320/007.DataInAzureDbStretchedTable.png" width="320" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
Here is where things get interesting. After a few minutes try querying the data again and see if there is any change. While writing this post the remote record disappeared; however, when I first wrote the scripts it didn't. So what is happening? To find out, let's compare the local and remote data further.</div>
<div>
<br /></div>
<div>
With the new record still in Azure, query the old on-premises database. If you're lucky you will find that the record inserted into the new database, and stretched to Azure, is also returned when you query the original database. Your experience here may differ as it's all a matter of timing.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjk6WIT6b5Q4Fy7A0N6Za375bySoAuW3TSJO7-lWERj56C7tv10TDBTznt8fg3YEDGndGcKOctRxrnORu8oUaKYuCqLhDQppO9dQanz3ZA8cBjV_9-qoNZ57CBNnr_bcQ5uEUvgSTfOXSm/s1600/008.DataInAzureDbStretchedTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="134" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhjk6WIT6b5Q4Fy7A0N6Za375bySoAuW3TSJO7-lWERj56C7tv10TDBTznt8fg3YEDGndGcKOctRxrnORu8oUaKYuCqLhDQppO9dQanz3ZA8cBjV_9-qoNZ57CBNnr_bcQ5uEUvgSTfOXSm/s320/008.DataInAzureDbStretchedTable.png" width="320" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
Just to add fuel to the fire I inserted another record into my new database that would be stretched to Azure.</div>
<div>
<br /></div>
<div>
<pre class="brush:sql">INSERT INTO [StretchDBDemoNew].[dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (11, 'Bart', 0, '0','yourname@email.com');</pre>
</div>
<div>
<br /></div>
<div>
<br /></div>
<div>
But what about our existing on-premises database? Remember, in my scenario we are only testing the migration, so you would assume there are still active connections querying that database, and potentially data being updated which would then stretch into Azure. So let's insert two records into that database which will be stretched.</div>
<div>
<br /></div>
<pre class="brush:sql">INSERT INTO [StretchDBDemo].[dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (4, 'Chad', 0, '0','yourname@email.com');
INSERT INTO [StretchDBDemo].[dbo].[People] ([PersonID], [FullName], [IsEmployee], [PhoneNumber], [EmailAddress]) VALUES (5, 'Dan', 0, '0','yourname@email.com');</pre>
<div>
<br /></div>
<div>
So in our existing database we now have this data being returned</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi13iukD6c5YZ3TyCpGdsjMmvdUc6YBs79EIDtc-W03eNw1a-0ZxHcQvKQ0adjHweXMGlSuMXFYv6mWLEmJ5_nG7l8ap75sfcp2rjGDQ-C1Zo7Gg5JniDzHlevySxUp6K2-Ouxrv10T5ZSs/s1600/009.DataInAzureDbStretchedTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="160" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi13iukD6c5YZ3TyCpGdsjMmvdUc6YBs79EIDtc-W03eNw1a-0ZxHcQvKQ0adjHweXMGlSuMXFYv6mWLEmJ5_nG7l8ap75sfcp2rjGDQ-C1Zo7Gg5JniDzHlevySxUp6K2-Ouxrv10T5ZSs/s320/009.DataInAzureDbStretchedTable.png" width="320" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
What about in our new database.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7_HHMlKWSyTlZErTRiBXi7LOJL387mK8nc4G9z9hP5oS-20KqE75DfezVjLzbQbt5SPIsX5hQ7jAaQBlooFANK4SywPNctAteEQE9JrtFLXbswsZH_92K-8abP4IYR8wrzu7qY9R1lYxb/s1600/010.DataInAzureDbStretchedTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="150" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7_HHMlKWSyTlZErTRiBXi7LOJL387mK8nc4G9z9hP5oS-20KqE75DfezVjLzbQbt5SPIsX5hQ7jAaQBlooFANK4SywPNctAteEQE9JrtFLXbswsZH_92K-8abP4IYR8wrzu7qY9R1lYxb/s320/010.DataInAzureDbStretchedTable.png" width="320" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
Wait a minute, those records 'Chad' and 'Dan' were just inserted into our existing database. And where did our records for 'Bill' and 'Bart' go?</div>
<div>
<br /></div>
<div>
Here is the issue: with both on-premises databases pointing to the same remote database through the with_copy = 0 option, the reconciliation processes in the two databases will conflict. One of the databases effectively becomes the master and overwrites the records the other sends. As I mentioned, both times I ran through the scripts I got a different experience: one time the new database was the master, and the other time (while writing this blog) the original database was the master and overwrote the records.</div>
<div>
<br /></div>
<div>
So, the good news is that if you use the with_copy = 1 option the databases use separate remote tables and therefore do not conflict. So my recommendation is ALWAYS USE "WITH_COPY = 1" when reauthorizing a database to the remote server.</div>
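<div>
<br /></div>
<div>
For reference, the recommended reauthorization looks like this (the credential name is a placeholder):</div>
<pre class="brush:sql">-- reauthorize and create a new copy of the remote table for this database
USE [StretchDBDemoNew];
GO
EXEC sp_rda_reauthorize_db
@credential = N'&lt;your Azure SQL Database server&gt;',
@with_copy = 1;
GO
</pre>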
<div>
<br /></div>
<div>
In saying that, a few things to keep in mind: over time, if you move the database a lot, you could end up with many tables in the remote database, so it is best to monitor that if you want to keep your costs down.</div>
<div>
<br /></div>
<div>
Also, if at any stage you are querying the stretched tables in the on-premises databases while this conflict situation exists, you could experience a number of occurrences of this error.</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimqbyLYG9Q0hnBRabIl0pQbQxe5bA5N6oJZIpSVSTyVle-KwtcLwaL9VjHP_mN2tX_oUwW12KEcdwSguaTRlmpVEPP0mZ5u-aXcPS5l20VIeowygq2Z_9yIDYQ99rlx7KXQv-DnN-J0odR/s1600/011.ReconciliationError.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="20" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimqbyLYG9Q0hnBRabIl0pQbQxe5bA5N6oJZIpSVSTyVle-KwtcLwaL9VjHP_mN2tX_oUwW12KEcdwSguaTRlmpVEPP0mZ5u-aXcPS5l20VIeowygq2Z_9yIDYQ99rlx7KXQv-DnN-J0odR/s400/011.ReconciliationError.png" width="400" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
I hope this blog post helps when it comes time to plan your migrations. If you have any questions or feedback, please leave them in the comments section.</div>
<div>
<br />
Get the complete demo scripts <a href="http://www.matticus.net/Files/20161220Blog/StretchDbDemo.zip" target="_blank">here</a>.</div>
<div>
<br /></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
<div>
<br /></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-40141370782742444632016-08-21T14:24:00.001+10:002016-08-23T16:54:12.095+10:00New SQL Server Builds API.... improve your health checks with version and support information<div dir="ltr" style="text-align: left;" trbidi="on">
<div>
Things have been a bit busy for me personally over the last few months but that is another story. What I have done is use some of this time to get back to a little project I started a while back.</div>
<div>
<br /></div>
<div>
Introducing the <b>SQL Server Builds API</b>..... providing version information including release dates, and known support life-cycle information.</div>
<div>
<br />
<br /></div>
<div>
<b><u>Overview</u></b></div>
<div>
The SQL Server Builds API provides SQL Server version information and known support life-cycle information. As it is a personal project of mine, the data is currently manually maintained from known sources. The main objective of the tool is to enhance health check and project planning type processes; it is not intended for licensing or other legal auditing processes.</div>
<div>
<br /></div>
<div>
In prior roles, and particularly in my current role, one of the most common checks I perform on a SQL Server is of its version and patch level. To aid in this I would run reports based off my own custom database of version numbers to check and validate the version of the instance being checked.</div>
<div>
<br />
<br />
<b><u>Querying the API</u></b><br />
Querying the API is pretty simple. The base URI is <i>http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions</i><br />
<br />
The API takes two parameters<br />
<br />
<ul style="text-align: left;">
<li>Version - <i>This is a string representation of the version of your SQL Instance returned by SELECT SERVERPROPERTY('PRODUCTVERSION') e.g. '12.00.2342.00'<br /></i></li>
<li>MissingUpdates - <i>A boolean value (True/False) of if to include any updates released since the current patch level in the results</i></li>
</ul>
<br />
So if I wanted to query for information about the version number 12.00.2342.00 my completed URI would be something like:<br />
<br />
<i>http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=12.00.2342.00&MissingUpdates=false</i><br />
<br />
NOTE: The MissingUpdates param defaults to false so you could omit it from the URI above.<br />
<br />
<br /></div>
<div>
<b><u>Where the data comes from</u></b></div>
<div>
The official source of truth for life-cycle data has to be Microsoft Support Life-cycle (<a href="https://support.microsoft.com/lifecycle">https://support.microsoft.com/lifecycle</a>), and this API/tool is by no means trying to replace that. In fact, if you are doing any official auditing for licensing or other legal purposes you MUST use that source over this API's data, or at the very least perform your own validation of each record.</div>
<div>
<br /></div>
<div>
Most DBAs and SQL admins would refer to <a href="http://sqlserverbuilds.blogspot.com.au/">http://sqlserverbuilds.blogspot.com.au</a>. I am no different and would previously refer to that site as well, so it is only natural that it is still one of the sources I use when building my API's database.</div>
<div>
<br /></div>
<div>
The database behind this API is currently manually maintained, and even if it were somewhat automated, mistakes can happen. If you find I am missing any updates, or if there is any incorrect data, please make sure you reach out and let me know.</div>
<div>
<br />
<br /></div>
<div>
<b><u>Suggestions for using the API</u></b></div>
<div>
The API could be used in any number of tools and checks. Here are just some I can think of:</div>
<div>
<ul style="text-align: left;">
<li>Custom health check scripts</li>
<li>Enhance on-premises CMDB data</li>
<li>PowerBI reports</li>
<li>Other custom Applications that interface/manage SQL Server instances</li>
</ul>
</div>
<div>
<br /></div>
<div>
<b><u>PowerShell Examples</u></b></div>
<div>
<br /></div>
<div>
<b><i>#1 - Get version information (static version number)</i></b></div>
<div>
<pre class="brush:ps"># Set the version number to test
$VersionNumber = '12.00.2342.00' # SQL 2014
# Call the API to get the version information
$VersionData = Invoke-RestMethod -Uri "http://sqlserverbuildsapi.azurewebsites.net/api/SQLVersions?Version=$($VersionNumber)";
# at this point simply return the $VersionData variable for the raw info
$VersionData
# Want to improve your Health Check script, calculate the health of Support Status
if ($VersionData.ExtendedSupportEnd -le (Get-Date)) {$SupportStatus = 'Critical'}
elseif ($VersionData.MainstreamSupportEnd -le (Get-Date)) {$SupportStatus = 'Warning'}
else {$SupportStatus = 'Ok'}
# format the output data string ($Instance is assumed to be set by your calling script)
$OutputData = @"
Instance = $($Instance.Name)
Version = $($VersionData.Version)
Product = $($VersionData.Product)
Branch = $($VersionData.Branch)
Update = $($VersionData.Update)
MainstreamSupportEnd = $($VersionData.MainstreamSupportEnd)
ExtendedSupportEnd = $($VersionData.ExtendedSupportEnd)
SupportStatus = $($SupportStatus)
"@
# Return the hashtable
ConvertFrom-StringData -StringData $OutputData;
</pre>
<pre class="brush:ps">
</pre>
</div>
<b><i>#2 - Get Missing updates (static version number)</i></b><br />
<div>
<pre class="brush:ps"># Set the version number to test
$VersionNumber = '11.00.2342.00' # SQL 2012
# Call the API to get the known missing updates (released since current version)
$VersionData = Invoke-RestMethod -Uri "http://sqlserverbuildsapi.azurewebsites.net:80/api/SQLVersions?Version=$($VersionNumber)&MissingUpdates=$true";
# return the output, it is already formated as an Array
$VersionData
# or as a table
$VersionData | Format-Table -Property Version,Branch,Update,@{Name="Released";Expression={Get-Date $_.Released -Format 'yyyy-MM-dd'}},Description -AutoSize
</pre>
</div>
<div>
<br />
<br />
<b><u>Other Examples</u></b><br />
There are more examples in my GitHub repository - <a href="https://github.com/Matticusau/SQLServerBuildsAPI-Examples">https://github.com/Matticusau/SQLServerBuildsAPI-Examples</a><br />
<br />
Including:
</div>
<div>
<br />
<ul style="text-align: left;">
<li>Simple static value examples like above</li>
<li>Both Single and Multiple Instance examples with live version number data direct from a SQL Instance</li>
<li>c# Windows Form client</li>
</ul>
</div>
<div>
Just want to see it in action? I have an MVC form available at <a href="https://sqlserverbuildsportal.azurewebsites.net/">https://sqlserverbuildsportal.azurewebsites.net</a><br />
<br />
<br /></div>
<div>
<b><u>Where is the source code for the API</u></b></div>
<div>
At this time I am not making the API open source.</div>
<div>
<br />
<br /></div>
<div>
<b><u>Want to partner with me</u></b></div>
<div>
This is a personal project. If you have a similar project, or a project that you would like to use this API in, please reach out and let's improve the world together.</div>
<div>
<br /></div>
<div>
If you have any questions around this API please make sure you comment or reach out to me through any of my contact means.</div>
<div>
<br /></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;">Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;"><br /></em></div>
<div>
<em style="background-color: white; color: #666666; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: xx-small; line-height: 12.6px;"><br /></em></div>
</div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-90230425522168090132016-04-15T10:45:00.000+10:002016-04-15T10:49:04.713+10:00TSQL To identify queries which are candidates for Parameterization<div dir="ltr" style="text-align: left;" trbidi="on">
While discussing the concepts of 'optimize for ad hoc workloads' and 'Forced Parameterization' in SQL Server, I decided to modify an existing query of mine that analyses query performance from the cache (more on it <a href="http://blog.matticus.net/2013/07/tsql-to-find-top-queries-by-avg-cpu-io.html" target="_blank">here</a>) so that it now identifies ad hoc queries which are candidates for parameterization.<br />
<br />
<pre class="brush:sql">
;WITH CTE(QueryCount, StatementTextForExample, plan_handle, QueyHash, QueryPlanHash, CacheObjType,
ObjType)
AS
(
SELECT
COUNT(query_stats.query_hash) AS QueryCount
, MIN(query_stats.query_text) AS StatementTextForExample
, MIN(query_stats.plan_handle) AS plan_handle
, query_stats.query_hash AS QueryHash
, query_stats.query_plan_hash AS QueryPlanHash
, query_stats.CacheObjType
, query_stats.ObjType
FROM
(
SELECT
qs.query_hash
, qs.query_plan_hash
, qs.plan_handle
, cp.cacheobjtype as [CacheObjType]
, cp.objtype as [ObjType]
, SUBSTRING(qt.[text], qs.statement_start_offset/2, (
CASE
WHEN qs.statement_end_offset = -1 THEN LEN(CONVERT(NVARCHAR(MAX), qt.[text])) * 2
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2
) AS query_text
FROM
sys.dm_exec_query_stats AS qs
INNER JOIN sys.dm_exec_cached_plans cp ON cp.plan_handle = qs.plan_handle
CROSS APPLY sys.dm_exec_sql_text(qs.[sql_handle]) AS qt
WHERE qt.[text] NOT LIKE '%sys.dm_exec_query_stats%'
AND cp.objtype = 'AdHoc'
--AND qs.last_execution_time BETWEEN DATEADD(hour,-1,GETDATE()) AND GETDATE() --change hour time frame
) AS query_stats
GROUP BY query_stats.query_hash
, query_stats.query_plan_hash
, query_stats.CacheObjType
, query_stats.ObjType
HAVING COUNT(query_stats.query_hash) > 1
)
SELECT
CTE.QueryCount
, CTE.CacheObjType
, CTE.ObjType
, CTE.StatementTextForExample
, tp.query_plan AS StatementPlan
, CTE.QueyHash
, CTE.QueryPlanHash
FROM
CTE
OUTER APPLY sys.dm_exec_query_plan(CTE.plan_handle) AS tp
ORDER BY CTE.QueryCount DESC;
--ORDER BY [Total IO] DESC;
</pre>
<br />
When you identify these candidates you then need to look at what is the most appropriate resolution such as.<br />
<br />
<ol style="text-align: left;">
<li>Rewrite the query at the application layer to ensure it is called with parameterization from the data provider</li>
<li>Rewrite the query as a stored procedure</li>
<li>Enable 'optimize for ad hoc workloads' on your SQL instance<br />
<pre class="brush:sql">
EXEC sys.sp_configure N'optimize for ad hoc workloads', N'1'
GO
RECONFIGURE WITH OVERRIDE
GO
</pre>
</li>
<li>Enable 'Forced Parameterization' on the affected database<br />
<pre class="brush:sql">
USE [master]
GO
ALTER DATABASE [AdventureWorksPTO] SET PARAMETERIZATION FORCED WITH NO_WAIT
GO
</pre>
</li>
</ol>
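To illustrate option 1, the same statement sent as an ad hoc batch can instead be sent through sys.sp_executesql so the literal becomes a parameter and the plan is reused across executions. Note the table, column and parameter names below are purely hypothetical for the example.<br />
<pre class="brush:sql">
-- Hypothetical example: pass the value as a parameter rather than
-- concatenating it into the statement text
EXEC sys.sp_executesql
    N'SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId;'
    , N'@CustomerId INT'
    , @CustomerId = 42;
</pre>
<br />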
Of course, the appropriate option depends on a broader view of your environment and applications that only you can determine. If you do have any thoughts on this please feel free to add them to the comments below to help others.<br />
<br />
A word of caution too: before adjusting any sp_configure settings with RECONFIGURE, make sure you run the following to check for any settings which are not yet active. It is expected to see 'min server memory (MB)' in the results if you leave that setting at its default of 0, as SQL Server must reserve the minimum memory possible, which is 16 MB.<br />
<pre class="brush:sql">
SELECT * FROM sys.configurations
WHERE Value <> value_in_use
</pre>
<br />
<br />
<span style="color: #666666; font-size: xx-small;"><em>Legal Stuff: The contents of this blog is provided “as-is”. The information, opinions and views expressed are those of the author and do not necessarily state or reflect those of any other company with affiliation to the products discussed. This includes any URLs or Tools. The author does not accept any responsibility from the use of the information or tools mentioned within this blog, and recommends adequate evaluation against your own requirements to measure suitability.</em></span><br />
<span style="color: #666666; font-size: xx-small;"><em></em></span><br />
<br />
<span style="color: green; font-family: Consolas; font-size: medium;"><br />
<span style="color: green; font-family: Consolas; font-size: medium;"><br />
<span style="color: green; font-family: Consolas; font-size: medium;"><br />
</span></span></span></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-86148814868201415052016-03-23T09:02:00.004+10:002016-03-23T09:05:01.883+10:00PowerShell: Finding commands (functions) with Risk Mitigation capabilities<div dir="ltr" style="text-align: left;" trbidi="on">
Yesterday I was presenting a workshop on implementing Risk Mitigation capabilities in your own Advanced Functions in PowerShell. This got me thinking about how to identify the commands that have that capability enabled, and more importantly, how their ConfirmImpact setting is configured.<br />
<br />
Background:<br />
If you don't know, PowerShell uses a number of inputs to determine how the Risk Mitigation capabilities are applied.<br />
For the WhatIf capability it is purely the presence of the -WhatIf switch parameter.<br />
For the Confirm capability it is a combination of the ConfirmImpact value in the [CmdletBinding()] attribute and the $ConfirmPreference variable value, or the presence of the -Confirm parameter.<br />
<br />
The following command will show all commands with the Risk Mitigation capability.<br />
<pre class="brush:ps">
Get-Command | Where-Object {$PsItem.Parameters.Keys.Count -gt 0 -and $PsItem.Parameters.ContainsKey('WhatIf')}
</pre>
<br />
We can extend this further to retrieve the ConfirmImpact value; however, this is not exposed through a property on the command object and can only be retrieved from the source definition. Unfortunately for cmdlets the definition is not available as they are compiled. It will work for functions though, so you can check your own commands.<br />
<br />
<pre class="brush:ps">
Get-Command -CommandType Function | `
Where-Object {$PsItem.Parameters.Keys.Count -gt 0 -and $PsItem.Parameters.ContainsKey('WhatIf')} | `
Select-Object -Property Name,CommandType,@{Name="Impact";Expression={[Regex]::Match($PSItem.Definition, "ConfirmImpact='(?&lt;Impact&gt;[^']+)'").Groups["Impact"].Value}}
</pre>
<br />
This pipeline command uses the RegEx type accelerator and the Match static method, but similar could be achieved with the -Match operator. Note that .NET regex group names are case sensitive, so the named group must match the name used when reading it from the Groups collection.<br />
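To see how a command ends up in the results above, here is a minimal sketch of an Advanced Function with the Risk Mitigation capabilities enabled. The function name and body are illustrative only.<br />
<pre class="brush:ps">
function Remove-DemoItem
{
    # SupportsShouldProcess adds the -WhatIf and -Confirm parameters;
    # ConfirmImpact='High' causes ShouldProcess to prompt when it meets
    # or exceeds the $ConfirmPreference value
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='High')]
    Param(
        [Parameter(Mandatory=$true)]
        [string]$Name
    )
    if ($PSCmdlet.ShouldProcess($Name, 'Remove'))
    {
        Write-Verbose "Removing $Name";
    }
}
</pre>
Because SupportsShouldProcess adds the -WhatIf parameter, this function would be returned by the Get-Command pipeline above.<br />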
<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i> </div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-54198929148093709542016-03-02T13:23:00.001+10:002016-03-03T11:43:00.249+10:00PowerShell ISE add-on to toggle collapsible sections in all files<div dir="ltr" style="text-align: left;" trbidi="on">
<b>UPDATE 03/03/2016</b>: It's amazing how quickly this went from a simple script yesterday to, within an hour, becoming a module, now released on <a href="https://www.powershellgallery.com/packages/PSISEToggleOutliningAddon" target="_blank">PowerShellGallery</a> and <a href="https://github.com/Matticusau/PSISEToggleOutliningAddon" target="_blank">GitHub</a>. Check out the <a href="https://github.com/Matticusau/PSISEToggleOutliningAddon/blob/master/README.md" target="_blank">instructions here</a> to install it. <br />
--------------- <br />
<br />
One feature I am hoping to see in the PowerShell ISE is for the state of collapsed sections to be maintained for "recent" files. It is just annoying, when you have a very large file that you regularly work on, to have to re-collapse all the sections you had previously collapsed each time you open it. Anyway, you can vote on that idea <a href="https://windowsserver.uservoice.com/forums/301869-powershell/suggestions/12523590-presist-the-state-of-collapsed-regions-functions-e" target="_blank">here</a>.<br />
<br />
If you didn't know, there is a keyboard shortcut <b>CTRL+M</b> which will toggle the collapsed sections for the current file; a handy little gem, that one. We can also programmatically invoke this method with:<br />
<br />
<pre class="brush:ps">$psISE.CurrentFile.Editor.ToggleOutliningExpansion()
</pre>
<br />
This got me thinking: I can easily write a function that iterates through each open tab/file and collapses it. Even better, I can wrap that up into an add-on for the ISE.<br />
<br />
I've added a menu item and also the keyboard shortcut <b>CTRL+SHIFT+M</b> to implement this. <br />
<br />
Save the following code into your Profile script and enjoy!<br />
<br />
<pre class="brush:ps">#requires -Version 4
<#
Script: ISEColapseAllFiles.ps1
Author: Matt Lavery
Created: 02/03/2016
Version: 0.0.1
Change History
Version Who When What
--------------------------------------------------------------------------------------------------
0.0.1 MLavery 02/03/2016 Initial Coding
#>
<#
.SYNOPSIS
Toggles the state of all expandable sections in all open files within the ISE
.DESCRIPTION
Toggles the state of all expandable sections in all open files within the ISE
Works across PowerShell Tabs.
Implements the keyboard shortcut CTRL+SHIFT+M and a menu item in the Add-ons menu.
#>
function Set-ISECollapseAllFiles
{
[CmdletBinding()]
Param()
Foreach ($psISETab in $psISE.PowerShellTabs)
{
Write-Verbose "PS Tab: $($psISETab.DisplayName)";
foreach ($psISEFile in $psISETab.Files)
{
Write-Verbose "PS File: $($psISEFile.DisplayName)";
$psISEFile.Editor.ToggleOutliningExpansion()
}
}
}
# remove the existing menu item if it exists
if ($psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.DisplayName.Contains('Toggle Collapse All Files'))
{
$psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Remove(($psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.GetEnumerator() | Where-Object DisplayName -EQ 'Toggle Collapse All Files'));
}
# add the add-on menu
$psISE.CurrentPowerShellTab.AddOnsMenu.Submenus.Add('Toggle Collapse All Files',{Set-ISECollapseAllFiles},"CTRL+SHIFT+M");
</pre>
<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-47495538533061418532016-02-25T22:44:00.002+10:002016-03-02T09:34:15.114+10:00Generating Certificates for Desired State Configuration lab/test environments<div dir="ltr" style="text-align: left;" trbidi="on">
A colleague pointed out to me today that it has been a long time since I blogged, and you know what, he is right, as I have mostly been posting tweets about interesting news. Yesterday I was taking some colleagues through setting up a Desired State Configuration lab environment and ran into an issue with the certificates we were trying to use. To make it worse, it is an issue I have faced in the past but had forgotten due to the time lag, so it seems like a great topic to blog about.<br />
<br />
Now a search in your favorite search engine will show that this is not an uncommon problem, but there aren't many that answer it plainly and simply.<br />
<br />
Do not use <b>New-SelfSignedCertificate</b> to generate a certificate for testing DSC deployments (encrypting credentials) on Windows 2012 R2 or you will most likely receive the error:<br />
<br />
<div style="background: navy;">
<ul>
<span style="color: red;">The private key could not be acquired.<br /> + CategoryInfo : NotSpecified: (root/Microsoft/...gurationManager:String) [], CimException<br /> + FullyQualifiedErrorId : MI RESULT 1<br /> + PSComputerName : localhost</span>
</ul>
</div>
<br />
<br />
<b>Solution #1</b>: Download and use the script/function from Script Center - <a href="https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6">https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6</a><br />
<br />
<br />
<b>Solution #2</b>: Generate the certificate with <b>MakeCert.exe</b> from the Visual Studio Command Prompt.<br />
<br />
For example, to make a certificate called DSCDemo:<br />
<br />
<ul><i>makecert.exe -r -pe -n "CN=DSCDemo" -sky exchange -ss my -sr localMachine </i></ul>
Further info on makecert can be found at <a href="https://msdn.microsoft.com/library/bfsktky3%28v=vs.100%29.aspx">https://msdn.microsoft.com/library/bfsktky3%28v=vs.100%29.aspx</a><br />
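As an aside, on newer platforms where the expanded <b>New-SelfSignedCertificate</b> cmdlet is available (Windows 10 / Server 2016 era), a suitable document-encryption certificate can be generated directly. The parameters below only exist in that newer version of the cmdlet, so treat this as a sketch and verify it against your own build.<br />
<pre class="brush:ps">
# Requires the newer New-SelfSignedCertificate; this will not work on
# the Windows 2012 R2 version discussed in this post
New-SelfSignedCertificate -Type DocumentEncryptionCertLegacyCsp `
    -DnsName 'DSCDemo' `
    -HashAlgorithm SHA256 `
    -CertStoreLocation 'Cert:\LocalMachine\My';
</pre>
<br />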
<br />
Some background: I was trying to set up a DSC Pull server with a configuration for installing a SQL Server instance using xSqlPs. This required a credential for the 'sa' account to be stored in the config, which clearly needed to be encrypted, as I didn't want to use plain text passwords in the DSC configuration. We tried all sorts of methods for exporting the certificate from the pull server, but eventually the clue was in the fact that even on the pull server itself the certificate reported no Private Key data in its properties.<br />
<br />
<br />
<ul>
<i>PS C:\Users\mattl> get-item Cert:\LocalMachine\my\{thumbprint} | fl *<br />...</i><br />
<i>Extensions : {System.Security.Cryptography.Oid, System.Security.Cryptography.Oid,<br /> System.Security.Cryptography.Oid, System.Security.Cryptography.Oid}<br />FriendlyName : DSCDemo<br />IssuerName : System.Security.Cryptography.X509Certificates.X500DistinguishedName<br />NotAfter : 2/25/2017 4:02:31 AM<br />NotBefore : 2/25/2016 3:42:31 AM<br />HasPrivateKey : True<br /><span style="background-color: yellow;">PrivateKey :</span><br />PublicKey : System.Security.Cryptography.X509Certificates.PublicKey<br />...</i><br />
<i>Issuer : CN=DSCDemo<br />Subject : CN=DSCDemo</i><br />
</ul>
Again the clue here was the fact that the Private Key isn't even displayed on the server where it was generated, so we know there is nothing wrong with the export/import; there is actually a problem with the way it was generated.<br />
<br />
A couple of searches and I found the blog post I used the last time I faced this issue; yep, isn't that annoying. It turns out this is a problem with the New-SelfSignedCertificate cmdlet, and when you generate the certificate with MakeCert.exe as above, the Private Key data is visible on the server it is generated on and also on the server it is imported on (as long as you export the private key data too).<br />
<br />
Hope this helps a few others, or at least helps me remember next time I face this problem before I waste a few hours trying to figure it out again ;)<br />
<br />
<br />
BTW in other news did you see the post that WMF 5.0 (aka PowerShell v5) RTM has been re-released. Happy days.<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i><br />
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-38421516907991124522015-10-29T08:59:00.000+10:002015-10-29T08:59:31.212+10:00Latest SSMS stream lines the process for adding Azure Firewall Rules at connection<div dir="ltr" style="text-align: left;" trbidi="on">
If you haven't heard, Microsoft has broken the release of SQL Server Management Studio away from the main product. While at this stage it continues to be released at the same cadence as the SQL 2016 CTP 2 cycle (i.e. monthly), the updates target features for existing releases, Azure SQL, and of course SQL 2016. You can get the latest release at <a href="http://aka.ms/ssms">http://aka.ms/ssms</a>.<br />
<br />
I have been working on a project of late with an Azure SQL DB back-end, and one thing that has always frustrated me, given my role keeps me out in the field, is that I am constantly having to log into the portal and update the firewall rules. Well, the latest release of SSMS makes this process much easier.<br />
<br />
Now when you try to connect to an Azure SQL server and a firewall rule on the server blocks your connection, you are no longer just shown the message informing you that the client is not allowed to connect. Instead you are prompted with a dialog asking you to log into Azure and create a firewall rule. It even populates the client IP address for you and suggests an IP range.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5sbf0TMsgyssOhmXauIsvGLt5heSoEI3_VUFi2czaRPq5nA8wATimDwu0-KDfIMhdma6prSyttCs-onX1QLQfq22F0N8O4pwZVlB1zaWBchlewGDnueKD5G6sFdXN4IgKvS71xfHVobny/s1600/SSMS-AzureConnAddFirewallRule-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="167" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5sbf0TMsgyssOhmXauIsvGLt5heSoEI3_VUFi2czaRPq5nA8wATimDwu0-KDfIMhdma6prSyttCs-onX1QLQfq22F0N8O4pwZVlB1zaWBchlewGDnueKD5G6sFdXN4IgKvS71xfHVobny/s320/SSMS-AzureConnAddFirewallRule-01.png" width="320" /></a></div>
<br />
Once you authenticate, select the appropriate option for the firewall rule (e.g. single IP or IP range). Click OK and SSMS will call the Azure web services to add the firewall rule.<br />
<br />
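If you prefer to script the rule instead, the same result could be achieved with the Azure PowerShell (Service Management) module of the time. The server name, rule name and IP addresses below are placeholders only.<br />
<pre class="brush:ps">
# Placeholders only - substitute your own server, rule name and IPs
Add-AzureAccount;
New-AzureSqlDatabaseServerFirewallRule -ServerName 'myazuresqlserver' `
    -RuleName 'HomeOffice' `
    -StartIpAddress '203.0.113.10' `
    -EndIpAddress '203.0.113.10';
</pre>
<br />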
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7R0fqimoqJ83qdbqXUwdvfZTFoo0EX0Y3b9e1daCeUbyUhw09_v7fodQK8Vuf-dPEM7yvwJjl0dYp-5RBJ9tcuwjxmddaQQ9-SQze7skBFh75M-J0hiKXeTeRd5rMe-dG_vfA82EI0-H8/s1600/SSMS-AzureConnAddFirewallRule-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="277" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7R0fqimoqJ83qdbqXUwdvfZTFoo0EX0Y3b9e1daCeUbyUhw09_v7fodQK8Vuf-dPEM7yvwJjl0dYp-5RBJ9tcuwjxmddaQQ9-SQze7skBFh75M-J0hiKXeTeRd5rMe-dG_vfA82EI0-H8/s320/SSMS-AzureConnAddFirewallRule-02.png" width="320" /></a></div>
<br />
Once the rule is added, the authentication process continues which is a nice touch because I was half expecting to be prompted to log in again.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTtOAGJhOBTvA32tdXeYysWSJF5xQQ64KiGPoym6fp9zVrbMOYThewJT5qZ8FnNXnNrz5mWUUcz8i_0XocU8BrchBBzXVcppCqbmLlmJR5ECb-rPObgX-Ls8fkBOcue40WS04n7kA3srbW/s1600/SSMS-AzureConnAddFirewallRule-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTtOAGJhOBTvA32tdXeYysWSJF5xQQ64KiGPoym6fp9zVrbMOYThewJT5qZ8FnNXnNrz5mWUUcz8i_0XocU8BrchBBzXVcppCqbmLlmJR5ECb-rPObgX-Ls8fkBOcue40WS04n7kA3srbW/s1600/SSMS-AzureConnAddFirewallRule-03.png" /> </a></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: left;">
Obviously, for this to work the user needs the required permissions on the subscription.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
This is a great example of how the team is actively working on improving the toolsets and responding to feedback, so make sure you keep the Connect items rolling in.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
In other news, I am presenting with a colleague at MS Ignite Australia next month on some of the new toolset features in SQL 2016. If you're attending, come say hi, as I will be around the Data Den and Exhibition hall throughout the event.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://msftignite.com.au/sessions/session-details/1524" target="_blank"><img alt="https://msftignite.com.au/sessions/session-details/1524" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhpy1QPxWppbzRDZfusgOozClUPsC9FmhlwmlkxMPmEPyeo5H6hV6gCCVMUX67ulE_xSh3HDuGs2P7NrGY30nTMnbYqLeSczO8wir_w-oqPZx0iJtODL-ZMZfh4TaeqZ_DqYKgjnxByqqap/s1600/104750_speakersigs_v01-01-4.png" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i> </div>
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-56180473809879511102015-06-30T15:08:00.001+10:002015-06-30T15:08:04.408+10:00SCOM: Sql Server Management Pack 6.6.0.0 released<div dir="ltr" style="text-align: left;" trbidi="on">
The SQL Server Management Pack has been refreshed with all new dashboards, which look to be fully extensible and designed for larger server farms. Full details on how to configure the dashboards are in the SQLServerDashboards.docx guide provided at the download links.<br />
<br />
The new dashboards do look sexy though, as these screenshots taken from the guides show.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEeGXlNp7vhjiuMbYs3TRmPO-U8-DZNdGsAf6yL0qDd2ZHsG6XbsHULJLvGtl71TaEGZKrE9DhXjOKI3GlpYwViQLWwjV1TZjcfuBcmOD7wA2nfzm4NPtZuzS8CYa2OAhHUVATsgxrH-vU/s1600/SQLServerMP6600-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEeGXlNp7vhjiuMbYs3TRmPO-U8-DZNdGsAf6yL0qDd2ZHsG6XbsHULJLvGtl71TaEGZKrE9DhXjOKI3GlpYwViQLWwjV1TZjcfuBcmOD7wA2nfzm4NPtZuzS8CYa2OAhHUVATsgxrH-vU/s400/SQLServerMP6600-01.png" width="400" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOV3H_VgTwbPkyUp_Hxr554BQMEMb5j6OyWrIgkVVS05qGQ3_ewim2NhtR4obzAn16cJYMLq0AWxJ5RRJfdRUV_J_cPdKxrsVcXs0xYIX_BFprfSA4jaLTCQY_k9wUfRTpLBeNQyYZNKbv/s1600/SQLServerMP6600-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="290" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOV3H_VgTwbPkyUp_Hxr554BQMEMb5j6OyWrIgkVVS05qGQ3_ewim2NhtR4obzAn16cJYMLq0AWxJ5RRJfdRUV_J_cPdKxrsVcXs0xYIX_BFprfSA4jaLTCQY_k9wUfRTpLBeNQyYZNKbv/s400/SQLServerMP6600-02.png" width="400" /></a></div>
<br />
<br />
Version 6.6.0.0 can be downloaded from:<br />
<br />
SQL 2005-2012: <a href="https://www.microsoft.com/en-us/download/details.aspx?id=10631">https://www.microsoft.com/en-us/download/details.aspx?id=10631</a><br />
SQL 2014: <a href="http://www.microsoft.com/en-us/download/details.aspx?id=42573">http://www.microsoft.com/en-us/download/details.aspx?id=42573</a><br />
<br />
Again check out the specific guides provided on implementing these dashboards.<br />
<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i> </div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-3471153950361482952015-06-18T22:43:00.000+10:002015-06-18T22:44:04.918+10:00Announcing the release of my SQL Server DSC Resource Module mlSqlPs on PowerShellGallery and Github<div dir="ltr" style="text-align: left;" trbidi="on">
I've been automating the installation of SQL Server components for 10+ years now so it's no surprise I am a fan of PowerShell and Desired State Configuration (DSC). Given my background in large, high volume, and diverse environments, the main objective I always think about is how the process can be easily replicated time and time again but also easily adapted to fit the needs of diverse configurations.<br />
<br />
Unfortunately to date I have found the xSqlPs DSC module falls just short of these needs, but that was also the intent of the DSC Resource Kit Waves: to provide the "starting blocks" for organizations to customize and extend the resources. I had already started talking with the team about how to improve the SQL resources, but now that the <a href="http://blogs.msdn.com/b/powershell/archive/2015/04/27/dsc-resource-kit-moved-to-github.aspx" target="_blank">DSC Resource Kit modules have been made open source on GitHub</a> the door is open for the community to contribute to the standard set of resources. All changes I mention below are pending pull requests back into the main branch of the xSqlPs resource on GitHub.<br />
<br />
The PowerShell Team is also working on their NuGet gallery at PowerShellGallery.com, which is going to greatly assist with the new PowerShellGet feature in PowerShell 5.0. The gallery is still in preview; however, I am starting to contribute to it and this module will be available there (link below).<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUBcKgEAAh6FXxdZxTmPhRtTAgRcd_PWeRZzPo-dws8IRosfYHP1U3n3W28pSp_rDlGGPKx1KO7PJWTGONnx86fWcURTGJvfGLY34cBm_7HoOjb9srZg6NmcamQPQRgdJZX2qLbLEA9mkl/s1600/June2015Release001.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="88" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUBcKgEAAh6FXxdZxTmPhRtTAgRcd_PWeRZzPo-dws8IRosfYHP1U3n3W28pSp_rDlGGPKx1KO7PJWTGONnx86fWcURTGJvfGLY34cBm_7HoOjb9srZg6NmcamQPQRgdJZX2qLbLEA9mkl/s640/June2015Release001.png" width="640" /></a><br />
<br />
To provide these features to everyone straight away I have also launched my own version of the SQL DSC module called <b>mlSqlPs</b>. This will allow me to get changes out quickly, while also working to merge the best of my work back into the main branch.<br />
<br />
Version 1.0.0.0 of <b>mlSqlPs</b> DSC Module provides the following resources and enhancements.<br />
<ul style="text-align: left;">
<li><b>xSqlServerInstall<span style="color: red;"> - Enhanced</span></b><br />Installs SQL Enterprise on target machine.<br />This resource has been enhanced to provide capabilities for aligning to SQL best practices (e.g. Data, Log, TempDb paths)</li>
<li><b>xSqlHAService</b><br />Enables SQL high availability (HA) service on a given SQL instance.</li>
<li><b>xSqlHAEndpoint</b><br />Configures the given instance of SQL high availability service to listen port 5022 with given name, and assigns users that are allowed to communicate through the SQL endpoint.</li>
<li><b>xSqlHAGroup</b><br />Configures a SQL HA group. If the HA group does not exist it will create one with the given name on the given SQL instance and add the HA group database(s) to the local SQL instance.</li>
<li><b>xWaitForSqlHAGroup</b><br />Waits for an SQL HA group to be ready by checking the state of the HA group of a given name in a given interval till either the HA group is discoverable or the number of retries reached its maximum.</li>
<li><b>xSqlAlias<span style="color: red;"> - New</span></b><br />Configures Client Aliases in both native and wow6432node paths. Supports both tcp and named pipe protocols.</li>
</ul>
The module contains a help file in both HTML and MD format, as well as some Sample files. <br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZdU2B1CGjrpwNq0ypLZtWXnV8eKXM5kC_o1Kz3Jk0tZMq8ZHO-AKvzeA6CgyYL54sGXnhcw4Z_pYLr9ZNu3CeqAz_nIOSHuahA22nMoDlrbvpVg3yve7z0Dolk-PpCFEPClwbETewOIEr/s1600/June2015Release005b.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZdU2B1CGjrpwNq0ypLZtWXnV8eKXM5kC_o1Kz3Jk0tZMq8ZHO-AKvzeA6CgyYL54sGXnhcw4Z_pYLr9ZNu3CeqAz_nIOSHuahA22nMoDlrbvpVg3yve7z0Dolk-PpCFEPClwbETewOIEr/s1600/June2015Release005b.png" /></a></div>
<br />
<br />
You can get the module at the following locations:<br />
<br />
PowerShell Gallery<br />
<a href="https://www.powershellgallery.com/packages/mlSqlPs/">https://www.powershellgallery.com/packages/mlSqlPs/</a><br />
<br />
GitHub - Want to contribute let me know<br />
<a href="https://github.com/matticusau/mlSqlPs">https://github.com/matticusau/mlSqlPs</a> <br />
<br />
<br />
Are you running <b>Windows PowerShell 5.0</b>? You can also get the module with the new Install-Module cmdlet.<br />
<pre class="brush:ps">#Search for the module on the default gallery
Find-Module -Name mlSqlPs -Repository PSGallery;</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkm-4PV0nTaYxEzL4BtjNpI_-RtoJgwHJv5QitcwMATxlZnoFDrV-zTY8wlBKbb_9eqVzhuBE-PA8ruubjYLHtO7Fs1h1NcZuV81Fpt58-Qf3LjJvHT2V7Rl-lu_Ks4_Deg2-DvcyBllN-/s1600/June2015Release002.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="60" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkm-4PV0nTaYxEzL4BtjNpI_-RtoJgwHJv5QitcwMATxlZnoFDrV-zTY8wlBKbb_9eqVzhuBE-PA8ruubjYLHtO7Fs1h1NcZuV81Fpt58-Qf3LjJvHT2V7Rl-lu_Ks4_Deg2-DvcyBllN-/s640/June2015Release002.png" width="640" /></a></div>
<pre class="brush:ps">#Install the module from the gallery
Install-Module -Name mlSqlPs -Repository PSGallery;</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjn2oEQeHenaCQgrR-X2S_rqQ-TTVUgFm6boOXRNnm70abT0u7nA-pvIyTSKA59CTRD-PsTMwcR3DES7Xsi5JASj3nLvekhdtOIrBhIbUXxK1VTbusNAJUcQzbw8hvKERXmSyKYaHwjTCua/s1600/June2015Release003.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="168" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjn2oEQeHenaCQgrR-X2S_rqQ-TTVUgFm6boOXRNnm70abT0u7nA-pvIyTSKA59CTRD-PsTMwcR3DES7Xsi5JASj3nLvekhdtOIrBhIbUXxK1VTbusNAJUcQzbw8hvKERXmSyKYaHwjTCua/s640/June2015Release003.png" width="640" /></a></div>
<br />
<br />
<pre class="brush:ps">#Get the list of resources
Get-DscResource -Module mlSqlPs;</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEga5L0cEHYJ0JQzYU7lWGH6004W-j5z2r5mXSp8mvV6jafTo8ZrQ71EUrfqy5dGtiL8hLLUr868ekWx3a2JcL5X6PF50AMqVkd3No0TR1ADM2oHhP2AxLGpHc96VCFFey3VT3KsNogQKLO0/s1600/June2015Release004.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="90" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEga5L0cEHYJ0JQzYU7lWGH6004W-j5z2r5mXSp8mvV6jafTo8ZrQ71EUrfqy5dGtiL8hLLUr868ekWx3a2JcL5X6PF50AMqVkd3No0TR1ADM2oHhP2AxLGpHc96VCFFey3VT3KsNogQKLO0/s640/June2015Release004.png" width="640" /></a></div>
<br />
<br />
And there you go, create your configuration, push or pull the mof to the node and watch the magic.<br />
<br />
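As a quick sketch of what such a configuration might look like when consuming the module, the following uses the xSqlAlias resource. The property names shown (Name, ServerName, Protocol) are assumptions for illustration only, so check the help file included with the module for the exact schema.<br />
<pre class="brush:ps">
Configuration SqlClientAliasDemo
{
    Import-DscResource -ModuleName mlSqlPs;

    Node 'localhost'
    {
        # Property names are assumed for illustration - verify them
        # against the module's included help file
        xSqlAlias DemoAlias
        {
            Name       = 'SqlAlias01';
            ServerName = 'SQLSERVER01\INSTANCE01';
            Protocol   = 'TCP';
        }
    }
}

# Generate the MOF ready to push or pull to the node
SqlClientAliasDemo -OutputPath 'C:\DSC\SqlClientAliasDemo';
</pre>
<br />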
<br />
Keep checking the GitHub repository and PowerShell Gallery for updates on this module.<br />
<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i> <br />
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0tag:blogger.com,1999:blog-2931150950398014304.post-86173433830206234332015-06-05T00:53:00.000+10:002015-06-05T14:16:16.623+10:00SCOM: Fixing a blank report list in Ops Mgr Console<div dir="ltr" style="text-align: left;" trbidi="on">
<div class="separator" style="clear: both; text-align: left;">
Something I came across in SCOM 2012 R2 is that when users try to view Reports from the OpsMgr console wunderbar they may receive a blank list.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_i3jD_4PXwnT7eZc7pz26siMgZ5xNJrmlTXRvlQgh60L26w9Ju1SUtchIhsfc4F6WcXoWas4BMXcpN0aXX-XXmtgJ2poRoTNftNOf8ZVKMKgyKZMgxjN9KBOfURL-usrCR4jHZSxdSyAd/s1600/SCOMReportsNotListing-01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="275" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_i3jD_4PXwnT7eZc7pz26siMgZ5xNJrmlTXRvlQgh60L26w9Ju1SUtchIhsfc4F6WcXoWas4BMXcpN0aXX-XXmtgJ2poRoTNftNOf8ZVKMKgyKZMgxjN9KBOfURL-usrCR4jHZSxdSyAd/s400/SCOMReportsNotListing-01.png" width="400" /></a></div>
<br />
<u><b>Possible Cause #1 - Incorrect SSRS file system permissions</b></u><br />
<div class="separator" style="clear: both; text-align: left;">
One thing that can impact this is incorrect file system permissions on the SSRS installation location.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
1. On the server where the SSRS instance is installed, navigate to the SSRS install folder. Depending on your environment this could be the default location below or a custom drive:</div>
<br />
<div class="separator" style="clear: both; text-align: left;">
<i>C:\Program Files\Microsoft SQL Server\MSRS&lt;version&gt;.&lt;instance&gt;\Reporting Services</i></div>
<br />
Where &lt;version&gt; is the numerical version of SQL (e.g. 11) and &lt;instance&gt; is the instance name, such as \MSRS11.MSSQLSERVER\Reporting Services<br /><br />
<div class="separator" style="clear: both; text-align: left;">
In my lab I customised this to the root of a partition like:</div>
<i>D:\MSRS11.MSSQLSERVER\Reporting Services </i><br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvGQFBOI7A3bumb3JMa6MGhLsYehsbfSrl_oOU6hl58ftVNSg5rdVAgHneiQsy12ZWuyC2xVjZhyphenhyphenDf_q2S5867XS1U4AFFeIeJk5N-eSZO7y9XN5vckc04OMX_fk78CqIgi7jdDP1W-Tb2/s1600/SCOMReportsNotListing-09.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="110" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvGQFBOI7A3bumb3JMa6MGhLsYehsbfSrl_oOU6hl58ftVNSg5rdVAgHneiQsy12ZWuyC2xVjZhyphenhyphenDf_q2S5867XS1U4AFFeIeJk5N-eSZO7y9XN5vckc04OMX_fk78CqIgi7jdDP1W-Tb2/s400/SCOMReportsNotListing-09.png" width="400" /></a></div>
<br />
<br />
2. Open the properties of /ReportManager and /ReportServer separately<br />
<br />
3. Ensure that a group containing the Windows accounts of the users trying to access the reports has Read &amp; Execute permissions on the relevant folders.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhDh7UaMr3ce4KJKjydB-T2eOxGBu20-3Kq_1bTVLTcpGXr3CHKOn-XWUlPsGA_uYxMWRk0qGOZuRgLeeGI1BRPzrgeFw5KQfqN-G9EVB9G0XwYi8GL9F86jz7xFtYQ_Zf7QqCqz3M7YZzl/s1600/SCOMReportsNotListing-10.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhDh7UaMr3ce4KJKjydB-T2eOxGBu20-3Kq_1bTVLTcpGXr3CHKOn-XWUlPsGA_uYxMWRk0qGOZuRgLeeGI1BRPzrgeFw5KQfqN-G9EVB9G0XwYi8GL9F86jz7xFtYQ_Zf7QqCqz3M7YZzl/s320/SCOMReportsNotListing-10.png" width="307" /></a></div>
<br />
<br />
This is from my lab where it is the default. You may choose to restrict this to a group (or groups) containing just the users granted rights to your reports, but you need to ensure these permissions exist.<br />
<br />
<br />
<br />
<u><b>Possible Cause #2 - Corrupted identities</b></u> <br />
If you face this issue you may find that some users, or even just the SCOM Administrator, can access the reports but others cannot. This could be following a restore of the "Reporting" component or even a restore/migration/upgrade of the whole SCOM environment.<br />
<br />
The key to understanding the cause of this is to look at how both SCOM and SSRS configure security.<br />
<br />
First access the Administration section from the OpsMgr Console wunderbar, and view the User Roles. Open the properties of the Report Operator User Role which contains the users who receive a blank list. Click on the Identity tab.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi81kTd8L3V-8oPY3IonQOWgTK8XEEA0rqQP0tU0rBlyeJbsWRSRSMIKZvxOl-gWJKK2GNmat8owsBQ0TJMsnrLM4rGep-dSGtqyQ0AN0_TcVOEcUwVZmndJ0QRQTRoE2bLZG310jh7vSJA/s1600/SCOMReportsNotListing-02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi81kTd8L3V-8oPY3IonQOWgTK8XEEA0rqQP0tU0rBlyeJbsWRSRSMIKZvxOl-gWJKK2GNmat8owsBQ0TJMsnrLM4rGep-dSGtqyQ0AN0_TcVOEcUwVZmndJ0QRQTRoE2bLZG310jh7vSJA/s400/SCOMReportsNotListing-02.png" width="400" /></a></div>
<br />
Take note of the ID which is displayed. Copy this to the clipboard.... you will need it ;)<br />
<br />
Now navigate to your Report Server url (e.g. http://localhost/Reports).<br />
Click on the Folder Settings button from the home screen. <br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxn0cN9g_fCBqty9iB4HhbZSZdwfV9ZD-BLUFY2WDzfjwrkHwP63zt-qxNyfCYNG7jSkZD8NZOIxJ7KU3-NZUKH-AqYa6Zz9r_AP8o3WC4jAAj1HRWdVm7B_I_rdxN0J07V4OwyHTgqR9m/s1600/SCOMReportsNotListing-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="71" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxn0cN9g_fCBqty9iB4HhbZSZdwfV9ZD-BLUFY2WDzfjwrkHwP63zt-qxNyfCYNG7jSkZD8NZOIxJ7KU3-NZUKH-AqYa6Zz9r_AP8o3WC4jAAj1HRWdVm7B_I_rdxN0J07V4OwyHTgqR9m/s400/SCOMReportsNotListing-03.png" width="400" /></a></div>
<br />
Here you will be shown the security of the reports. Check whether the ID from earlier is shown; chances are your list will contain a different set of IDs.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjch25qcocUAMVMFtMnLVpQ2OvbP6fu1msBy81o3B1xevB_PvIaGWBtFlDmfiSTxe1oBDMom6P2RxANcPLyh4691GYCrWcSj015XlKnAq98erM42a96IrLrfjUp5Uc8TCK5qrLIIIttPaMY/s1600/SCOMReportsNotListing-04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="76" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjch25qcocUAMVMFtMnLVpQ2OvbP6fu1msBy81o3B1xevB_PvIaGWBtFlDmfiSTxe1oBDMom6P2RxANcPLyh4691GYCrWcSj015XlKnAq98erM42a96IrLrfjUp5Uc8TCK5qrLIIIttPaMY/s400/SCOMReportsNotListing-04.png" width="400" /></a></div>
<br />
IMPORTANT: Do not remove any role assignments<br />
<br />
Click the "New Role Assignment" button.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixsckS5P3L5WWqytWtX5yxJ0aR5oJDWPX9_GH9SV2DsKdXZ-XdQFybVbywpj4zG8LvNNnn19a4e8L_o9XCwl8FKW-dt9KzBYga0tJrV7yUPpW3Mfo1RpzmF4XW4m8ud8Hr9-86uj4uXYwR/s1600/SCOMReportsNotListing-05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixsckS5P3L5WWqytWtX5yxJ0aR5oJDWPX9_GH9SV2DsKdXZ-XdQFybVbywpj4zG8LvNNnn19a4e8L_o9XCwl8FKW-dt9KzBYga0tJrV7yUPpW3Mfo1RpzmF4XW4m8ud8Hr9-86uj4uXYwR/s1600/SCOMReportsNotListing-05.png" /></a></div>
<br />
Enter the ID into the "Group or user name" field and select the Browser, My Reports, and Report Builder roles.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO3agCCu_mwlpRK4KvbN1X91bsPegVt6MOu8lXyHlB3MvvEEWx9HYV49NSRiD91twB6DKMhdUMLM-uCvPaaXPgu4NhEZW852Lo_XcihcFZqY4ZjYCBKxIOA-xZmoYxqWXxJ9zkXhzEGYnO/s1600/SCOMReportsNotListing-06.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="163" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO3agCCu_mwlpRK4KvbN1X91bsPegVt6MOu8lXyHlB3MvvEEWx9HYV49NSRiD91twB6DKMhdUMLM-uCvPaaXPgu4NhEZW852Lo_XcihcFZqY4ZjYCBKxIOA-xZmoYxqWXxJ9zkXhzEGYnO/s400/SCOMReportsNotListing-06.png" width="400" /></a></div>
<br />
Click OK<br />
<br />
The security assignments should now contain the ID of your User Role from OpsMgr.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhbucKgtIWZzQJg1Qhlgu9ovAyQD5L38hve3iMD2Nikjlitfn8iEEqrW3CN5gd-BRLClTHsvONhtRNRfPY-CmXgRsYyeJDESorVJHJF6yEZ6EEnonZPDeeotqV0SvnU6yiNS7oRmbv3GJnf/s1600/SCOMReportsNotListing-07.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="76" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhbucKgtIWZzQJg1Qhlgu9ovAyQD5L38hve3iMD2Nikjlitfn8iEEqrW3CN5gd-BRLClTHsvONhtRNRfPY-CmXgRsYyeJDESorVJHJF6yEZ6EEnonZPDeeotqV0SvnU6yiNS7oRmbv3GJnf/s400/SCOMReportsNotListing-07.png" width="400" /></a></div>
<br />
Now if you have your users refresh the Reports in the Ops Mgr Console they should have access to the reports. At least that was the solution I found in my lab.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzSven_xsJ4GNKO9ZoBga6S94bCbgzAKSvZ8RL0qDn108I00eRdWOYRcVkECuuF86D-B_7B86DthuPgS_59MQxWY4rAdITaRbOhv4W-1VapD-TvOCv-SC9iNMl8Xmta4y20VYBUifWFoH4/s1600/SCOMReportsNotListing-08.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="272" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzSven_xsJ4GNKO9ZoBga6S94bCbgzAKSvZ8RL0qDn108I00eRdWOYRcVkECuuF86D-B_7B86DthuPgS_59MQxWY4rAdITaRbOhv4W-1VapD-TvOCv-SC9iNMl8Xmta4y20VYBUifWFoH4/s400/SCOMReportsNotListing-08.png" width="400" /></a></div>
<br />
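If you prefer to double-check the role assignments outside the browser, you can query the ReportServer catalog database directly. Treat this purely as a read-only verification sketch: <i>dbo.Users</i>, <i>dbo.Roles</i> and <i>dbo.PolicyUserRole</i> are undocumented SSRS internals and may differ between versions, so verify against your own instance.<br />
<br />
<pre class="brush:sql">-- Read-only check: list every identity and the SSRS roles assigned to it.
-- Run against the ReportServer catalog database (undocumented tables).
SELECT u.UserName, r.RoleName
FROM dbo.PolicyUserRole AS pur
    INNER JOIN dbo.Users AS u ON u.UserID = pur.UserID
    INNER JOIN dbo.Roles AS r ON r.RoleID = pur.RoleID
ORDER BY u.UserName, r.RoleName;</pre>
<br />
The ID you copied from the OpsMgr User Role should appear in the UserName column once the new role assignment has been created.<br />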
If you find other causes let me know :)<br />
<br />
<br />
<u><b>Reference articles/blogs:</b></u><br />
Here are some other articles I have since found which provide troubleshooting guidance on this topic as well for your reference.<br />
<br />
<a href="http://thoughtsonopsmgr.blogspot.com.au/2011/09/empty-reports-in-scom-whats-happening.html">http://thoughtsonopsmgr.blogspot.com.au/2011/09/empty-reports-in-scom-whats-happening.html</a><br />
<a href="http://thoughtsonopsmgr.blogspot.ch/2010/08/how-to-scope-scom-r2-reports-for.html">http://thoughtsonopsmgr.blogspot.ch/2010/08/how-to-scope-scom-r2-reports-for.html</a><br />
<br />
<br />
<br />
<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i> <br />
<br />
<br />
<br />
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com1tag:blogger.com,1999:blog-2931150950398014304.post-76595936222494331052015-05-29T00:00:00.002+10:002015-05-29T00:09:58.327+10:00SQL Server 2016: Query Store first looks<div dir="ltr" style="text-align: left;" trbidi="on">
In case you missed the announcement SQL Server 2016 CTP2 preview was publicly announced today. You can read about it in the following post:<br />
<br />
<a href="http://blogs.technet.com/b/dataplatforminsider/archive/2015/05/27/sql-server-2016-first-public-preview-now-available.aspx">http://blogs.technet.com/b/dataplatforminsider/archive/2015/05/27/sql-server-2016-first-public-preview-now-available.aspx</a><br />
<br />
<br />
<br />
There are already some training materials up on <a href="https://technet.microsoft.com/en-us/virtuallabs">https://technet.microsoft.com/en-us/virtuallabs</a><br />
<br />
The first new feature I noticed was a new option when including the actual query plan: <b>Live Query Statistics</b>.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9JLN-DBQXlgyrk6-D3vMJ8SpGKnQfCQeWIiNLFGhl1GQ2he2BHDlXUIU9-GAEOSFLQPFj7kJ8P1RjSglpFha0A6ZsyjpFJe-UDkiOjmqqHbp14_QRTKfhSYNrR5bRM9oLt3hCFu_iWFFL/s1600/SQL2016CTP2-QueryStore04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9JLN-DBQXlgyrk6-D3vMJ8SpGKnQfCQeWIiNLFGhl1GQ2he2BHDlXUIU9-GAEOSFLQPFj7kJ8P1RjSglpFha0A6ZsyjpFJe-UDkiOjmqqHbp14_QRTKfhSYNrR5bRM9oLt3hCFu_iWFFL/s1600/SQL2016CTP2-QueryStore04.png" /></a></div>
<br />
When enabled, the actual query plan is displayed as the query executes, and it also shows the real-time progress of each stage of the plan. You can see this in the following screen shot where the Sort is at 65% (and consequently all tasks after it are also at 65%). I wasn't fast enough to capture it before it reached the sort, but it really does show you query execution in real time. I'm going to have to find a really large workload to try it out on ;)<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgjj-JRIt6QqfzWln372wHKlHgrHlsSffHM_8eI_O1cHhju_53hj5emH7X0Vu2yhooKbt6RYCmsegaEtb4fmsNl1mhpBTVZ1_EwMLWnDicOFSehHKnFK9X7rKHSQm-aigkXMcdCalWSA7SD/s1600/SQL2016CTP2-LiveQueryStats02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="121" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgjj-JRIt6QqfzWln372wHKlHgrHlsSffHM_8eI_O1cHhju_53hj5emH7X0Vu2yhooKbt6RYCmsegaEtb4fmsNl1mhpBTVZ1_EwMLWnDicOFSehHKnFK9X7rKHSQm-aigkXMcdCalWSA7SD/s400/SQL2016CTP2-LiveQueryStats02.png" width="400" /></a></div>
<br />
<br />
<br />
<b>Query Store</b> is one of the new features I have been "geeking out" on since earlier in the year, and I am so glad I can now talk about it publicly. Here is my first look at the feature first hand.<br />
<br />
First off when you view the properties of a database there is a new configuration section of "Query Store" as seen in the following screen shot.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgbkArZ2sPDW7mcpTpDSPwziTfEbvT1aeYNnR-6kryKxye9MgsIlk2rRF1lNeMNM9kyQCpEIGNubQeTWsD9cLVGoyA-GSdtfda-YXHv8cNXIU_xbQ6VmZnFZBrBtjCLKabpIOuYha0__vTK/s1600/SQL2016CTP2-QueryStore01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgbkArZ2sPDW7mcpTpDSPwziTfEbvT1aeYNnR-6kryKxye9MgsIlk2rRF1lNeMNM9kyQCpEIGNubQeTWsD9cLVGoyA-GSdtfda-YXHv8cNXIU_xbQ6VmZnFZBrBtjCLKabpIOuYha0__vTK/s400/SQL2016CTP2-QueryStore01.png" width="400" /></a></div>
<br />
Before you can access this feature you need to enable it on each database. Like all data collection there is a small resource overhead, but it's minimal due to how deeply it is integrated into the engine, so just enable it. And given the product is in preview status, should there be a performance impact, file a Connect item and let the product team know.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtDD7IbQjOjhfRJNIKwrlc-xYKaf6t8jNaAjCaZcNR6T0Q4-b2p38ywMPj0FA1QEU1q4bYD-h_20jVDX9BWngmxKunM-ITiaqDHEJEaBuNUwp1MleKCHn2szIdoTjnTnxr85K3HJTP2hdX/s1600/SQL2016CTP2-QueryStore02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="58" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtDD7IbQjOjhfRJNIKwrlc-xYKaf6t8jNaAjCaZcNR6T0Q4-b2p38ywMPj0FA1QEU1q4bYD-h_20jVDX9BWngmxKunM-ITiaqDHEJEaBuNUwp1MleKCHn2szIdoTjnTnxr85K3HJTP2hdX/s400/SQL2016CTP2-QueryStore02.png" width="400" /></a></div>
<br />
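If you prefer to script the GUI step, Query Store can also be enabled with T-SQL. The database name here is just a placeholder for your own:<br />
<br />
<pre class="brush:sql">-- Enable Query Store on a database (name is a placeholder)
ALTER DATABASE [AdventureWorks] SET QUERY_STORE = ON;</pre>
<br />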
Now that you have enabled it, you can customise the data collection frequency, buffer hardening (flush) interval, data retention/grooming, etc. Here are the defaults. <br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1nnPA7VCZk8D0vVUKyBa_bMfj7-DkTcxC94FVnF2IqwftV1eMHm5VsTO0c-Xhgv4V1Ba63Gxo2UzCcWRXWYzYX2TlevvXMGgRNIYA7B55t2sgb67vbP3W54qTPBVk4wWIEq1ziZpuHxdW/s1600/SQL2016CTP2-QueryStore03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="113" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1nnPA7VCZk8D0vVUKyBa_bMfj7-DkTcxC94FVnF2IqwftV1eMHm5VsTO0c-Xhgv4V1Ba63Gxo2UzCcWRXWYzYX2TlevvXMGgRNIYA7B55t2sgb67vbP3W54qTPBVk4wWIEq1ziZpuHxdW/s400/SQL2016CTP2-QueryStore03.png" width="400" /></a></div>
In my lab I have adjusted these values to be more granular for the purpose of this blog, but you will need to find the sweet spot for your environment ;)<br />
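The same settings can be adjusted through ALTER DATABASE. A sketch is below; the database name and values are placeholders, and since the product is in preview the option names may change before RTM, so check them against the build you are running:<br />
<br />
<pre class="brush:sql">-- Adjust Query Store settings (placeholder database name and lab values)
ALTER DATABASE [AdventureWorks]
SET QUERY_STORE
(
    OPERATION_MODE = READ_WRITE,
    DATA_FLUSH_INTERVAL_SECONDS = 900,  -- buffer hardening (flush) interval
    INTERVAL_LENGTH_MINUTES = 5,        -- statistics aggregation interval
    CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30),  -- retention/grooming
    MAX_STORAGE_SIZE_MB = 100           -- storage cap
);</pre>
<br />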
<br />
In the earlier screen shot of the database properties you can also see that there are graphical representations of the Query Store storage usage and a button to flush the storage if you need to.<br />
<br />
Now if you refresh the database in Object Explorer you will see a new Query Store branch which contains some useful views.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCIBzdEkhhAkZcZiSKM-1S50DkoFHXunmDR4dOw6eyEhe64kTc8eY9gmW18HVRtRXvem8gfcoyg29Sr2gA1RlaH5wYiz_udIgskOQ1Xd8j4sZ9iVB7uu8mhlXX0f5tsNsLLFqMWN-22ii0/s1600/SQL2016CTP2-QueryStore05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="345" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCIBzdEkhhAkZcZiSKM-1S50DkoFHXunmDR4dOw6eyEhe64kTc8eY9gmW18HVRtRXvem8gfcoyg29Sr2gA1RlaH5wYiz_udIgskOQ1Xd8j4sZ9iVB7uu8mhlXX0f5tsNsLLFqMWN-22ii0/s400/SQL2016CTP2-QueryStore05.png" width="400" /></a></div>
<br />
These views offer some interesting data that previously we would have had to use extra tools to obtain.<br />
<br />
The following shows the <b>Overall Resource Consumption</b> view.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTgX75SPIRlEZMvNXWk3wrTgzq4GK17kNGO2HiMDotVqzKh6zvUrulgW-2m0ce0ysvPmXQECciOe4fLCKwHkRQ1qUlcMQnAtXpDjIh0fyOO8R8r7QcoVC7sXlMweeswKUP-SqUlV9x42cl/s1600/SQL2016CTP2-QueryStore06.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="255" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTgX75SPIRlEZMvNXWk3wrTgzq4GK17kNGO2HiMDotVqzKh6zvUrulgW-2m0ce0ysvPmXQECciOe4fLCKwHkRQ1qUlcMQnAtXpDjIh0fyOO8R8r7QcoVC7sXlMweeswKUP-SqUlV9x42cl/s400/SQL2016CTP2-QueryStore06.png" width="400" /></a></div>
<br />
Each of the views can be configured to adjust the metrics, date range, etc. <br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzogKgO3KqM0_KVhN2EM4aPizrc5nDLa4zFFoSrhJHCHY7LD30turLdKDVEwr66y9rY_XQdgSqZHv_xuhHbSV2kNQjcl5H85XPqZMrMRsoYzOs6PNw-eBuWUa5yoNQPdtnYvA_7hyULTIA/s1600/SQL2016CTP2-QueryStore09.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzogKgO3KqM0_KVhN2EM4aPizrc5nDLa4zFFoSrhJHCHY7LD30turLdKDVEwr66y9rY_XQdgSqZHv_xuhHbSV2kNQjcl5H85XPqZMrMRsoYzOs6PNw-eBuWUa5yoNQPdtnYvA_7hyULTIA/s320/SQL2016CTP2-QueryStore09.png" width="204" /></a></div>
<br />
<br />
The most interesting of the views, in my opinion, is <b>Top Resource Consuming Queries</b>, as it shows the top X queries in the top-left pane and the query plans which have been used to return their results in the top-right pane.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyu4qJxDpsnop7CzFrPs9VjBi8cFT8JnIlBhWA2sFfOVuNizm-YlgQG0ssaDLtdy3o_yy_8hGrROsNDM5ETSA8dVOZkvar5rjBCV0DU6Pl9FDIeV894FRgQsyE-1ebSKBV-8uMkvZabes3/s1600/SQL2016CTP2-QueryStore08.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="226" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyu4qJxDpsnop7CzFrPs9VjBi8cFT8JnIlBhWA2sFfOVuNizm-YlgQG0ssaDLtdy3o_yy_8hGrROsNDM5ETSA8dVOZkvar5rjBCV0DU6Pl9FDIeV894FRgQsyE-1ebSKBV-8uMkvZabes3/s400/SQL2016CTP2-QueryStore08.png" width="400" /></a></div>
<br />
When you select a "plan id" it displays the cached plan for that id in the lower pane.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfrjP37kRFNxoMb4Ee1BlqSFUtQC7Y2yY7euW0eGiZrwx5leJb2nJXEnUM68Xt_Bu8Sv9_fG4SFCWxPKvfUvSiVhUClQh-gqGi78xziSmTT-547FHhPQCPOxtsiI095crSjKnKLR8l5_fW/s1600/SQL2016CTP2-QueryStore10.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="225" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfrjP37kRFNxoMb4Ee1BlqSFUtQC7Y2yY7euW0eGiZrwx5leJb2nJXEnUM68Xt_Bu8Sv9_fG4SFCWxPKvfUvSiVhUClQh-gqGi78xziSmTT-547FHhPQCPOxtsiI095crSjKnKLR8l5_fW/s400/SQL2016CTP2-QueryStore10.png" width="400" /></a></div>
<br />
<br />
Why is this so interesting? Well, as you can see in this screen shot I have had two plans for the same query: one with parallelism and one without. From the bubble chart I can see which one has executed at times with the highest resource consumption and which has executed with the least (based on my chart's configuration). From this data you can then choose whether to "force" a plan, or if you have forced a plan previously, you can "unforce" it. <br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkl-KnRrR9tUQ2Dc3VKrX0wNESocR5sfssQHlssd4r9rF7RXDzdJsMt_-4wdO2ZogEzmNqevR8RlAnXXRypeNfyTPXALEU4gq8uiZHLjyk7d_BEJzwXm1atCUeH2mrYAzfE7LLjRIVEUxD/s1600/SQL2016CTP2-QueryStore11.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="72" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkl-KnRrR9tUQ2Dc3VKrX0wNESocR5sfssQHlssd4r9rF7RXDzdJsMt_-4wdO2ZogEzmNqevR8RlAnXXRypeNfyTPXALEU4gq8uiZHLjyk7d_BEJzwXm1atCUeH2mrYAzfE7LLjRIVEUxD/s320/SQL2016CTP2-QueryStore11.png" width="320" /></a></div>
<br />
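The Force Plan and Unforce Plan buttons correspond to two new stored procedures, so the same action can be scripted. The query_id and plan_id values below are placeholders you would take from the Query Store views or the report above:<br />
<br />
<pre class="brush:sql">-- Pin the preferred plan for a query (ids are placeholders)
EXEC sys.sp_query_store_force_plan @query_id = 42, @plan_id = 7;

-- Remove the pin later if it no longer performs well
EXEC sys.sp_query_store_unforce_plan @query_id = 42, @plan_id = 7;</pre>
<br />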
This takes away all that effort of trying to figure out whether a query plan has changed and is causing a degradation in performance, and then having to extract that plan and create a plan guide for it, as well as chasing down many other changes in performance behaviour.<br />
<br />
Behind the scenes a great deal of data is being captured about query execution, which we can access from a number of system views.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIYPU9G3fMmPNq4atmXOdXtWtca2UVSGyx0EdOXZbz1smR-p2AO546fMY-vREwzRBnGK9EXOYEagCYpnaqNSS_0rtvwIO9AS-ebpp-AXgN9lm0La6B5jQ6B-lDXRHaRy0i2ft9mdmnCIvG/s1600/SQL2016CTP2-QueryStore12.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="142" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIYPU9G3fMmPNq4atmXOdXtWtca2UVSGyx0EdOXZbz1smR-p2AO546fMY-vREwzRBnGK9EXOYEagCYpnaqNSS_0rtvwIO9AS-ebpp-AXgN9lm0La6B5jQ6B-lDXRHaRy0i2ft9mdmnCIvG/s400/SQL2016CTP2-QueryStore12.png" width="400" /></a></div>
For example in the view <i><b>sys.query_store_query</b></i> you can see data on query timing internals such as compilation time, parse time, etc.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivtqB2GWyihyslgRS8BCnLO6udR5Y1Lo3FLlWxbZbs5C7ZihjwbAGmQsQZScSAorp2k48jR4AdoTMhzNdR6BFMFbkUbhW1Q_D867WnLEdxAVAtSIYJq9TDEG-nmpPhB4gRFuAoxpuivfy5/s1600/SQL2016CTP2-QueryStore13.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="27" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivtqB2GWyihyslgRS8BCnLO6udR5Y1Lo3FLlWxbZbs5C7ZihjwbAGmQsQZScSAorp2k48jR4AdoTMhzNdR6BFMFbkUbhW1Q_D867WnLEdxAVAtSIYJq9TDEG-nmpPhB4gRFuAoxpuivfy5/s400/SQL2016CTP2-QueryStore13.png" width="400" /></a></div>
<br />
<br />
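As a sketch of querying this data directly, the following joins the query view to its query-text view to surface the most compile-heavy statements. The column names are from the Query Store catalog views and, given this is a preview build, are worth verifying before relying on them:<br />
<br />
<pre class="brush:sql">-- Top 10 queries by average compile duration, with their SQL text
SELECT TOP (10)
    q.query_id,
    qt.query_sql_text,
    q.count_compiles,
    q.avg_compile_duration,
    q.last_execution_time
FROM sys.query_store_query AS q
    INNER JOIN sys.query_store_query_text AS qt
        ON qt.query_text_id = q.query_text_id
ORDER BY q.avg_compile_duration DESC;</pre>
<br />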
I am definitely looking forward to exploring this feature further, and to all the time I am going to get back when looking into performance issues.<br />
<br />
Obviously this is CTP2 and features/functionality will change before RTM.<br />
<br />
<i><span style="color: #666666; font-size: xx-small;">Legal Stuff: As always the contents of this
blog is provided “as-is”. The information, opinions and views expressed
are those of the author and do not necessarily state or reflect those of
any other company with affiliation to the products discussed. This
includes any URLs or Tools. The author does not accept any
responsibility from the use of the information or tools mentioned within
this blog, and recommends adequate evaluation against your own
requirements to measure suitability.</span></i> <br />
<br /></div>
Matt Laveryhttp://www.blogger.com/profile/07962199415215786355noreply@blogger.com0