Monthly Archives: May 2018

GDPR: Is your SQL Server environment ready? Part 1

Unless you have been living under a rock lately, you have heard about GDPR. It has been a hot topic on every technology website, blog, and social media network for the past two weeks, and even more so today, because this data regulation goes into effect today (05/25).

In this post, I will do my best to condense the information about this topic and really focus on what is important: is your SQL Server environment ready for GDPR?

So, what is GDPR?

It is a data regulation fundamentally about protecting and enabling the privacy rights of individuals. It also establishes strict global privacy requirements governing how personal data is managed and protected, while respecting individual choice.

Personal data in scope of the regulation can include, but is not limited to, the following:

  • Name
  • Identification number
  • Email address
  • Online user identifier
  • Social media posts
  • Physical, physiological, or genetic information
  • Medical information
  • Location
  • Bank details
  • IP address
  • Cookies

How is this going to affect my SQL Server environment?

So, if you are wondering how this is going to affect your SQL Server environment, or how you should prepare, don't you worry: Microsoft has it covered. I personally think Microsoft has done a wonderful job strengthening the security features of SQL Server over the last two versions (2016, 2017). There are multiple white papers, webinars, and free online training courses you can check to learn more about new security features such as Always Encrypted, Dynamic Data Masking, and Row-Level Security.

Because GDPR is a serious matter that involves larger companies across the world, Microsoft created a workflow you may want to adopt to make GDPR compliance easier to follow up on in your company. Here is the visual representation:

[Image: Microsoft's recommended GDPR compliance workflow]

SQL Server provides multiple out-of-the-box solutions that can be implemented to perform each of the processes Microsoft recommends, so let's take a look.

Discover:

This step is about locating sensitive information and identifying whether that information qualifies as personal data according to GDPR. You must answer the following questions:

  • Which servers and/or databases contain personal data? Which columns or rows can be marked as containing personal data?
  • Who has access to what elements of data in the database system? What elements and features of the database system can be accessed and potentially exploited to gain access to the sensitive data?
  • Where does the data go when it leaves the database?

SQL Server tools you can use during the discover phase:

    • SSMS – Vulnerability Assessment; check my previous blog post about this new SSMS feature.
    • Query sys.columns to identify column names that potentially contain personal information; you can also consider using Full-Text Search to expand your search.
    • Check whether it is possible to disable surface-area features such as xp_cmdshell, CLR, FILESTREAM, Cross DB Ownership Chaining, Ole Automation Procedures, External Scripts, and Ad Hoc Distributed Queries.
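To get started with the column-name search mentioned above, the sys.columns catalog view can be joined to sys.tables and sys.schemas. This is a minimal sketch; the keyword list is purely illustrative and should be extended to match your own naming conventions:

```sql
-- Search all user tables in the current database for column names
-- that suggest personal data (keyword list is illustrative only)
SELECT s.name AS schema_name,
       t.name AS table_name,
       c.name AS column_name
FROM sys.columns AS c
JOIN sys.tables  AS t ON t.object_id = c.object_id
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE c.name LIKE '%name%'
   OR c.name LIKE '%email%'
   OR c.name LIKE '%address%'
   OR c.name LIKE '%phone%'
   OR c.name LIKE '%birth%'
ORDER BY s.name, t.name, c.name;
```

Remember this query only covers the current database, so you would need to run it against each database that may hold personal data.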
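Most of the surface-area features listed above can be checked and turned off with sp_configure. A sketch for a few of them (test in a non-production environment first, since some applications may legitimately depend on these features):

```sql
-- Show advanced options so the surface-area settings become visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Disable features if nothing in your environment depends on them
EXEC sp_configure 'xp_cmdshell', 0;
EXEC sp_configure 'Ole Automation Procedures', 0;
EXEC sp_configure 'Ad Hoc Distributed Queries', 0;
RECONFIGURE;
```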

Manage:

Once personal information is located and gaps in data governance policies are identified during the discover phase, it is time to implement mechanisms to minimize the risk of unauthorized access to data or data loss.

SQL Server tools you can use during the manage phase:

    • SQL Server authentication: there are two modes, Windows authentication mode and mixed mode. Windows authentication is often referred to as integrated security because this SQL Server security model is tightly integrated with Windows; it is the best practice.
    • Create roles to define object-level permissions; granting permissions to roles rather than users simplifies security administration. Permissions assigned to roles are inherited by all members of the role, and users can be added to or removed from a role to include them in a permission set.
    • Use server-level roles for managing server-level access and security, and database roles for managing database-level access.
    • Azure SQL Database firewall, at the logical server and database level. This way only authorized connections have access to the database, which aligns with the GDPR requirements.
    • Dynamic data masking, to limit sensitive information exposure by masking the data to non-privileged users or applications.
    • Row-level security, to restrict access according to specific user entitlements.
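Granting permissions through roles rather than individual users, as described above, can be sketched like this (the role, table, and login names are hypothetical examples):

```sql
-- Create a database role and grant object-level permissions to it
CREATE ROLE CustomerDataReaders;
GRANT SELECT ON dbo.Customers TO CustomerDataReaders;

-- Members inherit the role's permissions and can be added or removed
ALTER ROLE CustomerDataReaders ADD MEMBER [DOMAIN\jdoe];
-- ALTER ROLE CustomerDataReaders DROP MEMBER [DOMAIN\jdoe];
```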
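Dynamic data masking (available in SQL Server 2016 and later) is applied per column with the MASKED WITH clause. A minimal sketch against a hypothetical dbo.Customers table:

```sql
-- Mask existing columns; non-privileged users see obfuscated values
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

ALTER TABLE dbo.Customers
    ALTER COLUMN PhoneNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');

-- Principals that need the real values can be granted UNMASK
-- (the user name below is hypothetical)
GRANT UNMASK TO [ComplianceAuditor];
```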

Please stay tuned for part 2 of this post, and thanks for reading!

SQL Vulnerability Assessment

Vulnerability assessment is probably one of the most underrated new security features in SSMS 17. The feature is not new to SQL Server, though; it already exists in Azure SQL Database, but it was finally introduced to SSMS with version 17.

If you are supporting a SQL Server environment where you have to face an exhausting external or internal IT audit process every quarter, or perhaps on a monthly basis, I think this tool is great at helping you be proactive, making you aware of all the security changes or threats you must remediate before the auditors arrive at your desk asking questions.

The only requirements for this new SSMS feature are to have SSMS version 17.4 or later installed on your client machine, and a valid path to store the vulnerability scan results:

According to Microsoft, the VA (let's use the short name from now on) runs a lightweight scan directly against your SQL Server instance or a specific database, looking for security vulnerabilities and deviations from best practices, such as misconfigurations, excessive permissions, and exposed sensitive data. It also checks that security best practices are in place in order to keep your database environment safe.

The VA results contain all the security checks (passed or failed), organized in three categories: high, medium, and low risk.

Report header:

Security checks detail:

If we take a look at the example above, we can see how Vulnerability Assessment provides guidance on how to resolve the failed checks. If a security check doesn't apply to your environment, you can dismiss it and it will no longer appear in future scans.

Let's dig a little deeper into the security check ID VA1287 – Sensitive data columns should be classified:

As you may have noticed, the screenshot has a highlighted section where the scan result shows a couple of table columns not in compliance with GDPR. As you may know, GDPR is a regulation in EU law on data protection and privacy, which goes into effect on 05/25; this is one major reason why you should consider using the VA tool.

If you scroll down to the bottom of the VA1287 check, the report provides a detailed description and the impact of having this vulnerability active in your environment:

So, at this point you have your scan results with the description and impact of each finding. Now let's take a look at what the tool recommends as remediation:

As you can see, there is no rocket science here: the vulnerability scan recommends implementing security measures such as Always Encrypted or Dynamic Data Masking (SQL Server 2016+) to protect the sensitive information in your environment. Most importantly, it helps you stay out of trouble with your internal or external IT auditors.

Let's not forget about the baseline option:
This is a very useful option to avoid false positives. Let's say you determined that one of the scan results does not apply to your environment and you don't want to see it in this report. The solution is to click on "Approve as Baseline", so the next scan will show the same result marked as passed.

Thanks for reading!

Reviewing the SQL Server ERROR log

The SQL Server ERROR log is the best source for understanding what's wrong with your SQL Server instance; think of this log as your flight recorder.

In case you are not too familiar with the SQL Server ERROR log, the xp_readerrorlog extended stored procedure is your friend, and as with any good friend, you should get to know it. In this post, you will find some examples of how to take advantage of this undocumented stored procedure and save time by avoiding slow and painful tools like the Windows Event Viewer.

First things first: let's understand which parameters we need to provide to the stored procedure and, most importantly, how to use them.

Archive ID:
The first parameter selects which error log file we want to query:

  • Current log = 0
  • Archived error log #1 = 1
  • Archived error log #2 = 2

Log type:
Then, the second parameter is the SQL Server component we are focusing on:

  • SQL Server instance log = 1
  • SQL Server Agent log = 2

So, combining these two parameters, we can check the current log (0 in the first parameter) for the SQL Server instance (1 in the second parameter) or the SQL Server Agent (2 in the second parameter).

Here is how it looks in T-SQL code:

-- Checking the current SQL Server instance and SQL Agent logs:
EXEC xp_readerrorlog 0, 1 -- Remember: 1 for the SQL Server instance log
EXEC xp_readerrorlog 0, 2 -- Remember: 2 for the SQL Server Agent log

We can also include an optional third parameter: a simple search string. This parameter is really handy when you are looking for a specific event.

Here are some T-SQL examples of how we can combine the first two parameters with a specific search string:

-- Checking for failed logins in our SQL Server instance:
EXEC xp_readerrorlog 0, 1, N'Login failed'

-- Checking for TempDB-related entries in our SQL Server instance:
EXEC xp_readerrorlog 0, 1, N'TempDB'

Note: on newer versions of SQL Server, the search string may need to be passed as a Unicode literal (N'...') to avoid an invalid parameter type error.

Hopefully this tip helps you save some time when looking through your SQL Server instance for recent errors, or for past events you might need to collect for further analysis.

Stay tuned for more DBA mastery tips.