Author Archives: DBA Master

About DBA Master

I’m a very experienced multi-platform Database Administrator (Oracle, SQL Server, MySQL), working for large-scale US-based companies and other companies around the world. I have multiple Microsoft SQL Server certifications under my belt (MCP, MCTS, MCSA, MCSE), and I am currently taking the Microsoft Data Science program.

Determine database version from a bak file

Have you ever imagined that you can determine the version of SQL Server from a bak or an MDF file? I have been in situations where I received a bak file, or MDF and LDF files, from a vendor to restore \ create a database for the implementation of a new application. Most of the time I receive complete details about the SQL Server versions supported by the product, but what happens if I don’t get that information?

SQL Server offers a way to get the actual version on which the files were created; let’s review the process using a database restore command.

RESTORE HEADERONLY
Despite what the name suggests, this command does not restore the backup; it just returns metadata (information) about the backup file you are examining.

Syntax:

 RESTORE HEADERONLY FROM DISK = 'physical_backup_device_name'

Here is an example of the RESTORE command and its output with two different files:
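
Something along these lines (the file paths below are hypothetical placeholders; point them at your own backup files):

 -- Hypothetical backup paths; replace them with your own bak files
 RESTORE HEADERONLY FROM DISK = N'C:\Backups\AdventureWorks2012.bak';
 RESTORE HEADERONLY FROM DISK = N'C:\Backups\AdventureWorks2014.bak';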

The RESTORE HEADERONLY command returns a large set of columns (more than 50) with detailed information about the backup file that can be very handy in other scenarios as well. For now, let’s focus on the DatabaseVersion column, which according to Microsoft Docs is just an INT value that represents the database version:

Taking the values from the two RESTORE examples above, 706 and 782 don’t say much on their own. After researching the DatabaseVersion column, I didn’t find an official source from Microsoft explaining what these values mean. To my surprise, I found the information I was looking for on a source that I particularly like and use very often to check SQL Server build numbers, patches, and CUs: a website called SQL Server builds.

If we look at the values in the Internal Database Version column from the image above, we can easily match each backup file with the SQL Server version on which the database was created. In this case, the backups I used for my examples were created on SQL Server 2012 (706) and SQL Server 2014 (782), which is indeed correct because I used the AdventureWorks sample databases from the Microsoft GitHub repository.
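
To make the lookup easier, here is a minimal T-SQL sketch that translates a DatabaseVersion value into a friendly version name, using the internal database version numbers listed on SQL Server builds; set @DatabaseVersion to the value you read from the HEADERONLY output:

 -- Minimal sketch: map an internal DatabaseVersion value to a SQL Server release
 DECLARE @DatabaseVersion INT = 782;

 SELECT CASE @DatabaseVersion
            WHEN 611 THEN 'SQL Server 2005'
            WHEN 655 THEN 'SQL Server 2008'
            WHEN 661 THEN 'SQL Server 2008 R2'
            WHEN 706 THEN 'SQL Server 2012'
            WHEN 782 THEN 'SQL Server 2014'
            WHEN 852 THEN 'SQL Server 2016'
            WHEN 869 THEN 'SQL Server 2017'
            ELSE 'Unknown or newer version'
        END AS CreatedOnVersion;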

Thanks for reading!

Is your SQL Server environment ready for GDPR? Part 2

In my previous blog post on this topic, I talked about what GDPR is and described the first two phases of Microsoft’s recommended workflow for achieving compliance with this data regulation.

The Discover and Manage phases were about finding where sensitive data is located and how it can be accessed, and about creating access controls for the system in line with the “least privilege” principle, so that only authorized access to the database system and data is allowed. Here is the visual representation of Microsoft’s recommended GDPR workflow:

Let’s move forward and define and describe the last two steps:

Protect:

This is where the data protection effort begins; the task requires security controls that are appropriate for the specific data types and usage scenarios.

In my opinion this is the most important phase in the workflow, because it is where the data becomes the main focus. Applying protection solutions and monitoring mechanisms is crucial to protecting the data.

SQL Server tools you can use during the protect phase:

    • Encryption, which can be applied at different levels (database, server, client)
      • Azure SQL Database and Azure SQL Data Warehouse require encryption for all connections
    • TLS: Microsoft currently supports version 1.2
    • TDE (Transparent Data Encryption) encrypts data at rest; it is often required by compliance regulations and various industry guidelines as a core requirement, since it ensures protection of data at the physical layer (see the sketch after this list)
    • Always Encrypted allows you to encrypt sensitive data inside client applications without ever revealing the encryption keys to the database layer. This is a very good solution when encryption at rest and on the wire is not sufficient
    • Auditing for Azure SQL Database tracks database activities by writing events to an audit log. It requires an Azure storage account to save the logs for further analysis
    • SQL Database Threat Detection, a built-in feature of Azure SQL Database, detects anomalous database activities indicating potential security threats to the database
    • SQL Vulnerability Assessment, part of SSMS starting with version 17.4, scans a SQL Server instance and its databases for compliance with data privacy standards. I have a dedicated blog post about this tool
    • SQL Server Audit tracks database activities and lets you analyze and investigate historical activity to identify potential threats, suspected abuse, and security violations
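
As a quick illustration of the Protect phase, here is a minimal sketch of enabling TDE on a hypothetical SalesDB database; the database name, certificate name, and password are placeholders, and remember to back up the certificate and its private key right after creating them:

 -- Minimal TDE sketch (names and password are placeholders)
 USE master;
 CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPasswordHere>';
 CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
 GO
 USE SalesDB;  -- hypothetical database
 CREATE DATABASE ENCRYPTION KEY
     WITH ALGORITHM = AES_256
     ENCRYPTION BY SERVER CERTIFICATE TDECert;
 ALTER DATABASE SalesDB SET ENCRYPTION ON;
 GO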

Report:
This is the last phase of the workflow; it deals with the continuous process of reviewing the controls and security of the system to better ensure ongoing compliance with GDPR principles.

The basis for reporting is maintaining documentation and records, and we have a couple of options available to fulfill this requirement:

    • SQL Server Audit: maintaining audit logs for all database activities ensures a persistent record of database access and processing activities at all times. These records can then be analyzed and packaged to provide the evidence needed for various record-keeping requirements
    • Temporal tables: a built-in feature starting with SQL Server 2016. System-versioned temporal tables are designed to keep a full history of data changes and allow easy point-in-time analysis (see the sketch after this list)
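
As a quick illustration of the temporal tables option, here is a minimal sketch with hypothetical table and column names; it keeps a full history of changes to a consent flag and allows point-in-time queries:

 -- Minimal system-versioned temporal table sketch (hypothetical names)
 CREATE TABLE dbo.CustomerConsent
 (
     CustomerId   INT          NOT NULL PRIMARY KEY,
     ConsentGiven BIT          NOT NULL,
     ValidFrom    DATETIME2(2) GENERATED ALWAYS AS ROW START NOT NULL,
     ValidTo      DATETIME2(2) GENERATED ALWAYS AS ROW END   NOT NULL,
     PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
 )
 WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerConsentHistory));

 -- Point-in-time analysis: the state of the table as of a given moment
 SELECT CustomerId, ConsentGiven
 FROM dbo.CustomerConsent
 FOR SYSTEM_TIME AS OF '2018-05-25T00:00:00';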

Once all the phases are complete, do not forget that the only way to achieve and maintain GDPR compliance is through regular checks of the security state of data and systems, to ensure that they meet the standards expected by the organization.

Thanks for reading!

Creating Wait stats widget on Azure Data Studio for macOS

A couple of weeks ago, Microsoft released a new multi-platform tool called Azure Data Studio, the final version of SQL Operations Studio. If you are familiar with SQLOps, you probably recall that this tool is 100% open source, and because of that you can customize its JSON configuration to make certain things work the way that suits you best.

In my personal opinion, one of the best features of Azure Data Studio is widgets. They give DBAs and database developers the option to create their own custom widgets to access SQL Server data using simple charts. I personally don’t like the buggy reports in SSMS, which take time to load and are not fully customizable … they are like a black box to me.

I decided to give it a try and create a custom wait stats widget, because I use wait stats every day and they are one of the most common troubleshooting methodologies among SQL Server data professionals. If you are not familiar with wait stats, you should be … they provide diagnostic information that can help determine the root cause or symptom of a potential bottleneck in multiple areas like:

  • Locking \ blocking issues
  • CPU pressure
  • Memory pressure
  • Disk IO problems

In this post, I will walk you through creating a custom widget to display the top 5 wait stats from your system, but first you must have Azure Data Studio installed on your computer. In case you don’t, do yourself a favor and download the macOS version from this link.

From now on, I will assume you have Azure Data Studio installed and a working connection to a SQL Server instance. All Azure Data Studio widgets use a query file as a source; in this case, I created a short version of the world-famous script by Paul Randal (b|t) to determine the wait stats for a SQL Server instance. You can download the modified script here from my GitHub repository.

This short version shows only the top five wait types from your system. I also removed the links from Paul’s version, where he explains each ignored wait type, just to make the code easier to read, but I strongly recommend you check Paul Randal’s blog post to learn more.
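
For reference, here is a simplified sketch of that kind of query; it is not the exact script from my repository, just an illustration of the approach, with an intentionally abbreviated list of ignored wait types:

 -- Simplified top-5 wait stats sketch (abbreviated ignore list; see Paul Randal's
 -- blog post and the script in my repository for the full version)
 SELECT TOP (5)
     wait_type                                           AS [WAIT TYPE],
     CAST(100.0 * wait_time_ms / SUM(wait_time_ms) OVER ()
          AS DECIMAL(5, 2))                              AS [PERCENTAGE]
 FROM sys.dm_os_wait_stats
 WHERE wait_type NOT IN (
     N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'CHECKPOINT_QUEUE',
     N'REQUEST_FOR_DEADLOCK_SEARCH', N'XE_TIMER_EVENT', N'XE_DISPATCHER_WAIT',
     N'LOGMGR_QUEUE', N'DIRTY_PAGE_POLL', N'WAITFOR', N'BROKER_TO_FLUSH',
     N'SQLTRACE_BUFFER_FLUSH', N'SP_SERVER_DIAGNOSTICS_SLEEP')
 ORDER BY wait_time_ms DESC;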

Download the script and move it to a known location rather than your Downloads folder; this will make things easier while creating the widget. A location like /Users/MyUser/Documents/wait_stats.sql would work.

Open the script with Azure Data Studio and execute it, then go to the results window, look at the top right corner of that section, and click the Chart icon:

Once the chart is displayed on the screen, look for the chart options in the right panel and make the following changes:

  • Chart Type = horizontalBar
  • Data Direction = Vertical
  • Legend Position = none
  • X Axis Label = PERCENTAGE
  • Y Axis Label = WAIT TYPE

As you may have noticed at this point, a nice horizontal bar chart is displayed on your screen, but we are only halfway there. Let’s move ahead and get the widget created: click the Create Insight button at the top of the RESULTS \ CHART grid to generate the JSON code for this chart.

Azure Data Studio will open a new window with some JSON code, and you probably noticed that the way it is displayed on the screen is not that readable. Don’t worry, we can address the formatting problem: select all the JSON code using the ⌘A (Command + A) shortcut, then run the Format Document function of Azure Data Studio using the ⇧⌥F (Shift + Option + F) shortcut, and voilà, the code now looks perfectly formatted:

Notice that I have highlighted the name key, because the auto-generated code uses My-Widget as its value by default; in our case we want to call this widget something like Wait stats. Also notice the queryFile key: its value points to the location where you saved the wait stats script at the beginning of this post.
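
After renaming the widget and pointing queryFile to the script, the generated JSON should look roughly like this; the exact keys and chart options may vary slightly between Azure Data Studio versions, and the path is the example location used earlier in this post:

 {
     "name": "Wait stats",
     "gridItemConfig": {
         "sizex": 2,
         "sizey": 1
     },
     "widget": {
         "insights-widget": {
             "type": {
                 "horizontalBar": {
                     "dataDirection": "vertical",
                     "dataType": "number",
                     "legendPosition": "none",
                     "labelFirstColumn": false,
                     "columnsAsLabels": true
                 }
             },
             "queryFile": "/Users/MyUser/Documents/wait_stats.sql"
         }
     }
 }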

Finally, it’s time to add our custom Wait stats widget to Azure Data Studio. Open the command palette using ⌘⇧P (Command + Shift + P) and type Preferences; you will see a list of options, so choose Preferences: Open User Settings from the list.

In the Search Settings text box, type dashboard.server.widgets, then click on dashboard.server.widgets and select the Edit option:

The left panel will turn yellow in Edit mode. Now look for the closing bracket of the last widget in that panel, in my case the all-database-size-server-insight widget. Carefully add a comma after that bracket, then copy the JSON code from the other window and paste it. It should look something like this:
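
Conceptually, the dashboard.server.widgets array in your user settings ends up with the new widget appended after the existing entries, roughly like this (both entries are abbreviated here for readability, and your existing widgets may look different):

 "dashboard.server.widgets": [
     {
         "gridItemConfig": { "sizex": 2, "sizey": 1 },
         "widget": { "all-database-size-server-insight": null }
     },
     {
         "name": "Wait stats",
         "gridItemConfig": { "sizex": 2, "sizey": 1 },
         "widget": {
             "insights-widget": {
                 "queryFile": "/Users/MyUser/Documents/wait_stats.sql"
             }
         }
     }
 ]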

Then simply save the changes to the user settings using the ⌘S (Command + S) shortcut and close all the open windows. Reconnect to one of your SQL Server instances, right-click it, and choose Manage; the server dashboard will be displayed, and there you can look for the widget we just created.

I hope you find this tip useful. In case you want to grab a copy of the SQL and JSON files used in this post, you can find them here.

Stay tuned for more DBA mastery tips!
