Monthly Archives: October 2018

Is your SQL Server environment ready for GDPR? Part 2

In my previous blog post on this topic, I covered what GDPR is and described the first two phases of Microsoft’s recommended workflow for complying with this data regulation.

The Discover and Manage phases were about discovering where the sensitive data is located and how it can be accessed, and about creating access controls for the system following the “least privilege” principle, so that only authorized users can reach the database system and its data. Here is the visual representation of Microsoft’s recommended GDPR workflow:

Let’s move forward, defining and describing the last two steps:


Protect

Here is where the data protection effort begins; this task requires security controls that are appropriate for the specific data types and usage scenarios.

In my opinion this is the most important phase in the workflow, because it is where the data becomes the main focus. Applying protection solutions and monitoring mechanisms is crucial to protecting the data.

SQL Server tools you can use during the Protect phase:

    • Encryption, which can be applied at different levels (database, server, client)
      • Azure SQL Database and Azure SQL Data Warehouse require encryption for all connections
    • TLS; Microsoft currently supports version 1.2
    • TDE (Transparent Data Encryption), which encrypts data at rest. It is often required by compliance regulations and various industry guidelines as a core requirement, as it ensures full protection of data at the physical layer.
    • Always Encrypted, which encrypts sensitive data inside client applications without ever revealing the encryption keys to the database engine. This is a very good solution when encryption at rest and on the wire is not sufficient.
    • Auditing for Azure SQL Database, which tracks database activities by writing events to an audit log. It requires a storage account to store the logs for further analysis.
    • SQL Database Threat Detection, a built-in feature of Azure SQL Database that detects anomalous database activities indicating potential security threats to the database.
    • SQL Vulnerability Assessment, part of SSMS starting with version 17.4. I have a dedicated blog post about this tool; it scans a SQL Server instance and its databases for deviations from data privacy and security best practices.
    • SQL Server Audit, which tracks database activities so you can analyze and investigate historical activity to identify potential threats, suspected abuse, and security violations.
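
To give a feel for what the TDE setup from the list above looks like, here is a minimal T-SQL sketch. The database name, certificate name, and password are placeholders, not values from this post:

```sql
-- Minimal TDE setup sketch (placeholder names; run with appropriate permissions).
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<UseAStrongPasswordHere>';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

USE MyDatabase;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;

-- From this point on, data and log files are encrypted at the physical layer.
ALTER DATABASE MyDatabase SET ENCRYPTION ON;
```

Remember to back up the certificate and its private key; without them the database cannot be restored elsewhere.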

Report

This last phase of the workflow deals with the continuous process of reviewing the controls and security of the system, to ensure ongoing compliance with GDPR principles.

The basis for reporting relies on maintaining documentation and records, so we have a couple of options available to fulfill this requirement:

    • SQL Server Audit: maintaining audit logs for all database activities ensures a persistent record of database access and processing activities at all times. These records can then be analyzed and packaged to provide the evidence needed for various record-keeping requirements.
    • Temporal tables, a built-in feature starting with SQL Server 2016. System-versioned temporal tables are designed to keep a full history of data changes and allow easy point-in-time analysis.
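
As an illustration of the temporal tables option, here is a hypothetical system-versioned table (table and column names are mine, purely illustrative):

```sql
-- System-versioned temporal table: SQL Server maintains the history automatically.
CREATE TABLE dbo.Customer
(
    CustomerId  INT            NOT NULL PRIMARY KEY CLUSTERED,
    Email       NVARCHAR(256)  NOT NULL,
    ValidFrom   DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo     DATETIME2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));

-- Point-in-time analysis: how did this row look on a given date?
SELECT CustomerId, Email
FROM dbo.Customer
FOR SYSTEM_TIME AS OF '2018-05-25T00:00:00'
WHERE CustomerId = 42;
```

The history table gives you the persistent record of changes that the reporting phase asks for, with no trigger code to maintain.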

Once all the phases are complete, do not forget that the only way to achieve and maintain GDPR compliance is through regular checks of the security state of your data and systems, to ensure they meet the standards expected by the organization.

Thanks for reading!

I’m a very experienced multi-platform Database Administrator (Oracle, SQL Server, MySQL), working for large-scale US-based companies and other companies around the world.

I have multiple Microsoft SQL Server certifications under my belt (MCP, MCTS, MCSA, MCSE), and currently taking the Microsoft Data Science program.

Creating Wait stats widget on Azure Data Studio for macOS

A couple of weeks ago, Microsoft released a new multi-platform tool called Azure Data Studio, the final version of SQL Operations Studio. If you are familiar with SQLOps, you probably recall that this tool is 100% open source, and because of that you can customize its JSON configuration to make certain things work the way that’s best for you.

In my personal opinion, one of the best features of Azure Data Studio is widgets. They give DBAs and database developers the option to create their own custom widgets to access SQL Server data using simple charts. I personally don’t like the buggy reports in SSMS, which take time to load and are not fully customizable … they are like a black box to me.

I decided to give it a try and create a custom widget for wait stats, because I use them every day and wait-stats analysis is one of the most common troubleshooting methodologies among SQL Server data professionals. If you are not familiar with wait stats, you should be … they provide diagnostic information that can help determine the root cause or symptom of a potential bottleneck in multiple areas, like:

  • Locking \ blocking issues
  • CPU pressure
  • Memory pressure
  • Disk IO problems
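
The underlying idea can be sketched with a simplified query against the wait-stats DMV. This is not Paul Randal’s script (which filters out dozens of benign wait types and computes signal waits); it is just a minimal illustration of where the numbers come from:

```sql
-- Simplified sketch: top 5 waits by share of total wait time since instance start.
-- A production script filters out many more benign wait types than shown here.
SELECT TOP (5)
    wait_type,
    wait_time_ms,
    CAST(100.0 * wait_time_ms / SUM(wait_time_ms) OVER () AS DECIMAL(5, 2)) AS pct
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'CLR_SEMAPHORE', N'LAZYWRITER_SLEEP',
                        N'SLEEP_TASK', N'BROKER_TASK_STOP')  -- sample of benign waits
ORDER BY wait_time_ms DESC;
```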

In this post, I will walk you through the creation of a custom widget that displays the top 5 wait stats for your system, but first you must have Azure Data Studio installed on your computer. In case you don’t, do yourself a favor and download the macOS version from this link.

From now on, I will assume you have Azure Data Studio installed and a working connection to a SQL Server instance. All Azure Data Studio widgets use a query file as a source; in this case I created a short version of the world-famous script by Paul Randal (b|t) that determines the wait stats for a SQL Server instance. You can download the modified script from my GitHub repository.

This short version shows only the top five wait types on your system. I also removed the links from Paul’s version, where he explains each ignored wait type, just to make the code easier to read, but I strongly recommend checking Paul Randal’s blog post to learn more.

Download the script and move it to a known location rather than your Downloads folder; this will make things easier while creating the widget. A location like /Users/MyUser/Documents/wait_stats.sql would work.

Open the script with Azure Data Studio and execute it, then go to the results window, look at the top-right corner of that section, and click the Chart icon:

Once the chart is displayed in the screen, look for the chart options from the right panel and make the following changes:

  • Chart Type = horizontalBar
  • Data Direction = Vertical
  • Legend Position = none
  • X Axis Label = PERCENTAGE
  • Y Axis Label = WAIT TYPE

As you may have noticed at this point, a nice horizontal bar chart is displayed on your screen, but this is just halfway. Now let’s move ahead and get the widget created: click the Create Insight button at the top of the RESULTS \ CHART grid to generate the JSON code for this chart.

Azure Data Studio will open a new window with some JSON code, and you probably noticed that the way it is shown on screen is not very readable. Don’t worry, we can address the formatting problem: select all the JSON code using the ⌘A (Command + A) shortcut, then run Azure Data Studio’s Format Document function using the ⇧⌥F (Shift + Alt + F) shortcut, and voilà, the code now looks perfectly formatted:

Notice that I have highlighted the name key, because the auto-generated code uses My-Widget as the default value; in our case we want to call this widget something like Wait stats. Also notice the queryFile key: its value points to the location where you saved the wait stats script at the beginning of this post.
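
After renaming, the generated entry looks roughly like this. The overall shape (name, gridItemConfig, widget with an insights-widget and a queryFile) comes from the auto-generated code; the grid sizes and file path here are illustrative:

```json
{
    "name": "Wait stats",
    "gridItemConfig": {
        "sizex": 2,
        "sizey": 1
    },
    "widget": {
        "insights-widget": {
            "type": {
                "horizontalBar": {
                    "dataDirection": "vertical",
                    "legendPosition": "none",
                    "xAxisLabel": "PERCENTAGE",
                    "yAxisLabel": "WAIT TYPE"
                }
            },
            "queryFile": "/Users/MyUser/Documents/wait_stats.sql"
        }
    }
}
```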

Finally, it’s time to add our custom Wait stats widget to Azure Data Studio. Open the Azure Data Studio command palette using ⌘⇧P (Command + Shift + P), type Preferences, and from the list of options choose Preferences: Open User Settings.

In the Search Settings text box, type dashboard.server.widgets, then click dashboard.server.widgets and select the Edit option:

The left panel will turn yellow in Edit mode. Now look for the closing bracket of the last widget in that panel, in my case all-database-size-server-insight. Carefully add a comma after that bracket, then copy the JSON code from the other window and paste it. It should look something like this:

Then simply save the changes to the user settings using the ⌘S (Command + S) shortcut and close all the open windows. Reconnect to one of your SQL Server instances, right-click it, and choose Manage; the server dashboard will be displayed, and there you can look for the widget we just created.

I hope you find this tip useful. In case you want to grab a copy of the SQL and JSON files used in this post, you can find them here.

Stay tuned for more DBA mastery tips!


Starting SSMS with a specific connection and script file

Have you ever wanted SSMS to start, automatically establish a connection to your DBA \ hub server, and have your daily routine queries already open and ready to run?

Good news: that is totally possible. SSMS is launched through the SSMS.exe executable, which has multiple options, such as connecting to a specific SQL Server instance and opening a file. All these options are accessible through a few very simple command-line parameters; if you are familiar with SQLCMD, these parameters will look really similar to you.

Let’s explore the different options available through the command line: open PowerShell or CMD and run the following command:

 ssms -help 

As you may have noticed, a pop-up window shows up displaying the multiple options: server name, database name, user name, password, file name, log name, etc.

Now that we know the parameters, let’s have some fun with them! Do you remember what I mentioned at the beginning of this post? Let’s do a test: let’s call SSMS from PowerShell and ask it to open a connection to a specific server (MASTER01 in this case), loading a SQL script file that contains a daily routine:
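
The call looks roughly like this; the server name is the one used in this post, while the script path is a hypothetical example, and -E requests Windows authentication:

```
Ssms.exe -S MASTER01 -d master -E "C:\Scripts\DailyChecks.sql"
```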


OK, but let’s make this more interesting. Since we already know SSMS.exe is an executable and we have all these cool parameters to play with, let’s modify the existing SSMS shortcut; this time we will make SSMS load a solution containing multiple SQL script files:


As you may have noticed from the animation, I just added a set of specific parameters to the existing shortcut target; to be specific, this is what I added:

-S master01 
-d master 

This modification makes SSMS start without showing the traditional splash screen (because, yes, I want to save a second of my time), and it also pre-loads a SQL Server solution containing a collection of scripts I intend to run from my central SQL Server instance (master01). In this case I chose the “master” database as the default database to connect to, but as you can figure out at this point, all these options can be changed 🙂
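
Putting it all together, the shortcut target ends up looking something like the line below. The Ssms.exe path and the solution file name are illustrative (your SSMS install path will differ), and -nosplash is the parameter that suppresses the splash screen:

```
"C:\Program Files (x86)\...\Ssms.exe" -nosplash -S master01 -d master "C:\Scripts\DBA.ssmssln"
```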

So this is how you can set a specific configuration for SSMS to use every time you start the application. I consider this a very handy trick you can use to save yourself some time.

Stay tuned for more DBA mastery tips.
